Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Gregory F. Welch is active.

Publication


Featured research published by Gregory F. Welch.


Presence: Teleoperators & Virtual Environments | 2009

History: The use of the Kalman filter for human motion tracking in virtual reality

Gregory F. Welch

In 1960 Rudolph E. Kalman published his now-famous article describing a recursive solution to the discrete-data linear filtering problem (Kalman, A new approach to linear filtering and prediction problems, Transactions of the ASME, Journal of Basic Engineering, 82(D), 35-45, 1960). Since that time, due in large part to advances in digital computing, the Kalman filter has been the subject of extensive research and applications, particularly in the area of autonomous or assisted navigation. The purpose of this paper is to acknowledge the approaching 50th anniversary of the Kalman filter with a look back at the use of the filter for human motion tracking in virtual reality (VR) and augmented reality (AR). In recent years there has been an explosion in the use of the Kalman filter in VR/AR. In fact, at technical conferences related to VR these days, it would be unusual to see a paper on tracking that did not use some form of a Kalman filter, or draw comparisons to those that do. As such, rather than attempt a comprehensive survey of all uses of the Kalman filter to date, what follows focuses primarily on the early discovery and subsequent period of evolution of the Kalman filter in VR, along with a few examples of modern commercial systems that use the Kalman filter. This paper begins with a very brief introduction to the Kalman filter, a brief look at the origins of VR, a little about tracking in VR, in particular the work and conditions that gave rise to the use of the filter, and then the evolution of the use of the filter in VR.
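For readers unfamiliar with the filter itself, here is a minimal, illustrative one-dimensional Kalman filter sketch in Python. This is not code from the paper; the constant-value process model and the noise values q and r are arbitrary choices made for the example.

```python
# Minimal scalar Kalman filter sketch (illustrative only, not from the paper).
# State: a single value x; constant-value process model with process noise q;
# noisy scalar measurements z with measurement noise r.

import random

def kalman_step(x_est, p_est, z, q=1e-3, r=1e-1):
    """One predict/update cycle of a scalar Kalman filter."""
    # Predict: the constant model carries the estimate forward,
    # while the uncertainty grows by the process noise.
    x_pred = x_est
    p_pred = p_est + q

    # Update: blend prediction and measurement using the Kalman gain.
    k = p_pred / (p_pred + r)          # Kalman gain
    x_new = x_pred + k * (z - x_pred)  # corrected estimate
    p_new = (1.0 - k) * p_pred         # reduced uncertainty
    return x_new, p_new

# Example: filter noisy measurements of a constant true value of 1.0.
x, p = 0.0, 1.0
for _ in range(50):
    z = 1.0 + random.gauss(0.0, 0.3)
    x, p = kalman_step(x, p, z)
print(round(x, 3))  # converges toward 1.0
```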


Operating Systems Review | 1995

A survey of power management techniques in mobile computing operating systems

Gregory F. Welch

Many factors have contributed to the birth and continued growth of mobile computing, including recent advances in hardware and communications technology. With this new paradigm, however, come new challenges in computer operating systems development. These challenges include heretofore relatively unusual items such as frequent network disconnections, communications bandwidth limitations, resource restrictions, and power limitations. It is the last of these challenges that we explore in this paper: the question of what techniques can be employed in mobile computer operating systems to reduce the power consumption of today's mobile computing devices.


IEEE/ION Position, Location and Navigation Symposium | 2014

Development of Vision-aided Navigation for a Wearable Outdoor Augmented Reality System

Alberico Menozzi; Brian Clipp; Eric Wenger; Jared Heinly; Enrique Dunn; Herman Towles; Jan-Michael Frahm; Gregory F. Welch

This paper describes the development of vision-aided navigation (i.e., pose estimation) for a wearable augmented reality system operating in natural outdoor environments. This system combines a novel pose estimation capability, a helmet-mounted see-through display, and a wearable processing unit to accurately overlay geo-registered graphics on the user's view of reality. Accurate pose estimation is achieved through integration of inertial, magnetic, GPS, terrain elevation, and computer vision inputs. Specifically, a helmet-mounted forward-looking camera and custom computer vision algorithms are used to provide measurements of absolute orientation (i.e., orientation of the helmet with respect to the Earth). These orientation measurements, which leverage mountainous terrain horizon geometry and/or known landmarks, enable the system to achieve significant improvements in accuracy compared to GPS/INS solutions of similar size, weight, and power, and to operate robustly in the presence of magnetic disturbances. Recent field testing activities, across a variety of environments where these vision-based signals of opportunity are available, indicate that high accuracy (less than 10 mrad) in graphics geo-registration can be achieved. This paper presents the pose estimation process, the methods behind the generation of vision-based measurements, and representative experimental results.
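To illustrate the general idea of correcting drifting inertial orientation with an absolute vision-derived measurement, the following is a hedged sketch of a simple complementary-filter blend. It is not the authors' estimator (which fully integrates inertial, magnetic, GPS, terrain, and vision data); the function name and the weight alpha are hypothetical.

```python
# Generic complementary-filter illustration (hypothetical; not the paper's estimator).
# Blends a gyro-propagated heading (drifts over time) with an absolute heading
# fix derived from vision (noisy but drift-free).

import math

def fuse_heading(gyro_heading, vision_heading, alpha=0.02):
    """Return a heading (radians) nudged toward the absolute vision measurement.

    gyro_heading:   heading propagated by integrating gyro rates; accumulates drift.
    vision_heading: absolute heading from horizon geometry or known landmarks.
    alpha:          weight given to the absolute measurement at each step.
    """
    # Wrap the difference to (-pi, pi] so the correction takes the short way around.
    innovation = math.atan2(math.sin(vision_heading - gyro_heading),
                            math.cos(vision_heading - gyro_heading))
    return gyro_heading + alpha * innovation
```

In practice a small alpha keeps the short-term smoothness of the inertial estimate while the absolute measurement slowly removes accumulated drift.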


American Journal of Orthodontics and Dentofacial Orthopedics | 2012

Three-dimensional evaluation of changes in lip position from before to after orthodontic appliance removal

Lindsey Eidson; Lucia Helena Soares Cevidanes; Leonardo Koerich de Paula; H. Garland Hershey; Gregory F. Welch; P. Emile Rossouw

INTRODUCTION: Our objectives were to develop a reproducible method of superimposing 3-dimensional images for measuring soft-tissue changes over time and to use this method to document changes in lip position after the removal of orthodontic appliances. METHODS: Three-dimensional photographs of 50 subjects were made with a stereo camera in repose and in maximum intercuspation, before and after orthodontic appliance removal. For reliability assessment, 2 photographs were repeated for 15 patients. The images were registered on stable areas, and surface-to-surface measurements were made for defined landmarks. RESULTS: Mean changes were below the level of clinical significance (set at 1.5 mm). However, 51% and 18% of the subjects experienced changes greater than 1.5 mm at the commissures and lower lips, respectively. CONCLUSIONS: The use of serial 3-dimensional photographs is a reliable method of documenting soft-tissue changes. Soft-tissue changes after appliance removal are not clinically significant; however, there is great individual variability.


British Machine Vision Conference | 2007

Structure from Motion via a Two-Stage Pipeline of Extended Kalman Filters

Brian Clipp; Gregory F. Welch; Jan-Michael Frahm; Marc Pollefeys

We introduce a novel approach to on-line structure from motion, using a pipelined pair of extended Kalman filters to improve accuracy with a minimal increase in computational cost. The two filters, a leading and a following filter, run concurrently on the same measurements in a synchronized producer-consumer fashion, but offset from each other in time. The leading filter estimates structure and motion using all of the available measurements from an optical flow based 2D tracker, passing the best 3D feature estimates, covariances, and associated measurements to the following filter, which runs several steps behind. This pipelined arrangement introduces a degree of noncausal behavior, effectively giving the following filter the benefit of decisions and estimates made several steps ahead. This means that the following filter works with only the best features, and can begin full 3D estimation from the very start of the respective 2D tracks. We demonstrate a reduction of more than 50% in mean reprojection errors using this approach on real data.
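To make the data flow concrete, here is a hedged Python sketch of the pipelining structure only: a leading stage processes every frame immediately and vets features, while a following stage re-processes the same measurements several frames later using only the vetted features. The names, the lag value, and the stand-in functions are hypothetical; the real system runs full extended Kalman filters in each role.

```python
# Sketch of a leading/following pipeline with a fixed time offset (hypothetical
# structure; the filter internals are placeholders, not real EKFs).

from collections import deque

LAG = 5  # hypothetical offset, in frames, between the two filters

def leading_filter_step(measurements):
    """Stand-in for the leading filter: estimate everything, flag reliable features."""
    estimates = {m["id"]: m["uv"] for m in measurements}            # placeholder estimates
    best_ids = {m["id"] for m in measurements if m["score"] > 0.8}  # vetted features
    return estimates, best_ids

def following_filter_step(measurements, vetted_ids):
    """Stand-in for the following filter: use only features vetted by the leader."""
    return {m["id"]: m["uv"] for m in measurements if m["id"] in vetted_ids}

delay_buffer = deque()  # measurements waiting for the following filter
vetted_ids = set()      # decisions accumulated by the leader and passed to the follower

def process_frame(measurements):
    """Run the leading filter now; run the following filter LAG frames behind."""
    _, best = leading_filter_step(measurements)
    vetted_ids.update(best)
    delay_buffer.append(measurements)
    if len(delay_buffer) > LAG:
        old_measurements = delay_buffer.popleft()
        return following_filter_step(old_measurements, vetted_ids)
    return None  # the follower has not yet caught up to this frame
```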


Virtual Realities | 2015

Applications of Avatar Mediated Interaction to Teaching, Training, Job Skills and Wellness

Charles E. Hughes; Arjun Nagendran; Lisa A. Dieker; Michael C. Hynes; Gregory F. Welch

The focus of this chapter is on the application of a framework for remotely delivering role-playing experiences that afford users the opportunity to practice real-world skills in a safe virtual setting. The framework, AMITIES, provides a single individual the capability to remotely orchestrate the performances of multiple virtual characters. We illustrate this by introducing avatar-enabled scenarios that range from teacher preparation to effectively dealing with complex interpersonal situations, such as resistance to peer pressure and participation in job interviews (either as the interviewer or the interviewee).


The Rural Special Education Quarterly | 2015

Virtual Learning Environments for Students with Disabilities: A Review and Analysis of the Empirical Literature and Two Case Studies.

Eleazar Vasquez; Arjun Nagendran; Gregory F. Welch; Matthew T. Marino; Darin E. Hughes; Aaron Koch; Lauren Delisio

Students with autism spectrum disorder (ASD) show varying levels of impairment in social skills situations. Interventions have been developed utilizing virtual environments (VEs) to teach and improve social skills. This article presents a systematic literature review of peer-reviewed journal articles focusing on social interventions in VEs involving K-12th grade students with ASD. This exhaustive analysis across four major online databases was guided by operational terms related to intervention type and K-12 students with ASD. The empirical search yielded a very narrow body of literature (n=19) on the use of VEs as social skill interventions for students with ASD. Two case study examples of experiments exploring the use of VEs and students with ASD are presented to illustrate possible applications of this technology.


IEEE Virtual Reality Conference | 2017

Exploring the effect of vibrotactile feedback through the floor on social presence in an immersive virtual environment

Myungho Lee; Gerd Bruder; Gregory F. Welch

We investigate the effect of vibrotactile feedback delivered to one's feet in an immersive virtual environment (IVE). In our study, participants observed a virtual environment in which a virtual human (VH) walked toward them and paced back and forth within their social space. We compared three conditions: participants in the “Sound” condition heard the footsteps of the VH; participants in the “Vibration” condition experienced the vibration of the footsteps along with the sounds; and participants in the “Mute” condition were exposed to neither sound nor vibrotactile feedback. We found that participants in the “Vibration” condition felt a higher social presence with the VH than those who did not feel the vibration. Participants in the “Vibration” condition also exhibited greater avoidance behavior while facing the VH and when the VH invaded their personal space.


Computer Animation and Virtual Worlds | 2017

The effects of virtual human's spatial and behavioral coherence with physical objects on social presence in AR

Kangsoo Kim; Divine Maloney; Gerd Bruder; Jeremy N. Bailenson; Gregory F. Welch

In augmented reality, people can experience the illusion of virtual humans (VHs) integrated into a real (physical) space. However, affordances of the real world and virtual content might conflict, for example, when the VHs and real objects “collide” by occupying the same space. This implausible conflict can cause a break in presence in real–virtual human interactions. In this paper, we address an effort to avoid this conflict by maintaining the VH's spatial and behavioral coherence with respect to physical objects or events (e.g., natural occlusions and appropriate help-requesting behaviors to avoid implausible physical–virtual collisions). We present a human subject experiment examining the effects of physical–virtual coherence on human perceptions, such as social/copresence, and on behaviors with the VH. The basic ideas, experimental design, and results supporting the benefit of the VH's spatial and behavioral coherence are presented and discussed.


International Symposium on Mixed and Augmented Reality | 2015

Human Perception and Psychology in Augmented Reality (HPPAR) Summary

Bruce H. Thomas; Gregory F. Welch; James Baumeister

Summary form only given. The main thrust of this half-day workshop is the development of a research agenda for human perception and psychology in augmented reality. With the convergence of historical advances in Augmented Reality (AR) techniques with small and affordable technologies (e.g., sensors, displays, and input devices), we are on the threshold of AR becoming an effective tool for many applications and even everyday life. Traditional AR research has focused on creating technologies, algorithms, and techniques to achieve demonstrations that we evaluate with engineering benchmarks, such as the performance of a tracking system, the photorealism of the graphics, and human performance in completing a task.

Collaboration


Dive into Gregory F. Welch's collaborations.

Top Co-Authors

Henry Fuchs, University of North Carolina at Chapel Hill
Arjun Nagendran, University of Central Florida
Gerd Bruder, University of Central Florida
Kangsoo Kim, University of Central Florida
Andrei State, University of North Carolina at Chapel Hill
Ramesh Raskar, Massachusetts Institute of Technology
Andrew Nashel, University of North Carolina at Chapel Hill
Herman Towles, University of North Carolina at Chapel Hill