Publication


Featured research published by Greg Dawe.


Assistive Technology | 2006

Robotics and virtual reality: A perfect marriage for motor control research and rehabilitation

James L. Patton; Greg Dawe; Chris Scharver; Ferdinando A. Mussa-Ivaldi; Robert V. Kenyon

This article's goal is to outline the motivations, progress, and future objectives for the development of a state-of-the-art device that allows humans to visualize and feel synthetic objects superimposed on the physical world. The programming flexibility of these devices allows for a variety of scientific questions to be answered in psychology, neurophysiology, rehabilitation, haptics, and automatic control. The benefits are most probable in the rehabilitation of brain-injured patients, for whom the costs are high, therapist time is limited, and repetitive practice of movements has been shown to be beneficial. Moreover, beyond simple therapy that guides, strengthens, or stretches, the technology affords a variety of exciting potential techniques that can combine our knowledge of the nervous system with the tireless, precise, and swift capabilities of a robot. Because this is a prototype, the system will also guide new experimental methods by probing the levels of quality that are necessary for future design cycles and related technology. Critical to the project is the early and intimate involvement of therapists and other clinicians in the design of the software and its user interface. Inevitably, it should also lead the way to new modes of practice and to the commercialization of haptic/graphic systems.


International Conference of the IEEE Engineering in Medicine and Biology Society | 2004

Robotics and virtual reality: the development of a life-sized 3-D system for the rehabilitation of motor function

James L. Patton; Greg Dawe; Chris Scharver; Ferdinando A. Mussa-Ivaldi; Robert V. Kenyon

We have been developing and combining state-of-the-art devices that allow humans to visualize and feel synthetic objects superimposed on the real world. This effort stems from the need for a platform that extends experiments on motor control and learning to realistic human motor tasks and environments, which are not currently represented in the practice of research. This paper's goal is to outline our motivations, progress, and objectives. Because the system is a general tool, we also hope to motivate researchers in related fields to join in. The platform under development, an augmented reality system combined with a haptic-interface robot, will be a new tool for contributing to the scientific knowledge base in the area of human movement control and rehabilitation robotics. Because this is a prototype, the system will also guide new methods by probing the levels of quality necessary for future design cycles and related technology. Inevitably, it should also lead the way to the commercialization of such systems.


Proceedings of SPIE | 2001

Varrier autostereographic display

Daniel J. Sandin; Todd Margolis; Greg Dawe; Jason Leigh; Thomas A. DeFanti

The goal of this research is to develop a head-tracked, stereo virtual reality system utilizing plasma or LCD panels. This paper describes a head-tracked barrier auto-stereographic method that is optimized for real-time interactive virtual reality systems. In this method, a virtual barrier screen is created, simulating the physical barrier screen, and placed in the virtual world in front of the projection plane. An off-axis perspective projection of this barrier screen, combined with the rest of the virtual world, is projected from at least two viewpoints corresponding to the eye positions of the head-tracked viewer. During the rendering process, the simulated barrier screen effectively casts shadows on the projection plane. Since the different projection points cast shadows at different angles, the different viewpoints are spatially separated on the projection plane. These spatially separated images are projected into the viewer's space at different angles by the physical barrier screen. The flexibility of this computational process allows more complicated barrier screens than the parallel opaque lines typically used in barrier-strip auto-stereography. In addition, this method supports the focusing and steering of images for a user's given viewpoint, and allows for very wide angles of view. This method can produce an effective panel-based auto-stereo virtual reality system.
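As a rough illustration of the barrier-shadow geometry described in this abstract, the sketch below assigns display columns to the left or right eye by intersecting eye-to-pixel rays with a simulated line barrier placed just in front of the projection plane. It is a simplified model under assumed parameters (barrier pitch, barrier gap, duty cycle, and head pose are hypothetical values), not the authors' implementation.

```python
import numpy as np

# Hypothetical geometry, with z measured from the projection (display) plane:
#   display plane at z = 0, virtual barrier plane at z = BARRIER_GAP,
#   viewer's eyes at z = head_z, separated by the interocular distance.
SCREEN_WIDTH  = 400.0   # display width in mm (assumed)
NUM_COLUMNS   = 1600    # pixel columns across the display (assumed)
BARRIER_GAP   = 1.5     # barrier-to-display distance in mm (assumed)
BARRIER_PITCH = 1.0     # period of the line barrier in mm (assumed)
DUTY_CYCLE    = 0.5     # fraction of each period that is transparent (assumed)

def barrier_is_open(x):
    """True where the simulated line barrier is transparent at position x."""
    return (x % BARRIER_PITCH) < DUTY_CYCLE * BARRIER_PITCH

def column_mask(eye_x, eye_z):
    """Boolean mask of display columns visible to one eye through the barrier.

    A column at px on the display (z = 0) is visible if the ray from the eye
    at (eye_x, eye_z) to that column crosses the barrier plane inside a slit;
    columns blocked by opaque lines lie in the barrier's "shadow".
    """
    px = (np.arange(NUM_COLUMNS) + 0.5) / NUM_COLUMNS * SCREEN_WIDTH
    t = (eye_z - BARRIER_GAP) / eye_z          # ray parameter at the barrier plane
    bx = eye_x + (px - eye_x) * t              # crossing point on the barrier plane
    return barrier_is_open(bx)

if __name__ == "__main__":
    ipd, head_x, head_z = 65.0, 200.0, 600.0   # tracked head pose (hypothetical)
    left  = column_mask(head_x - ipd / 2, head_z)
    right = column_mask(head_x + ipd / 2, head_z)
    # Columns seen by only one eye would receive that eye's off-axis rendering.
    print("left-only columns :", int(np.sum(left & ~right)))
    print("right-only columns:", int(np.sum(right & ~left)))
```

In a head-tracked system of the kind the abstract describes, this interleaving would be recomputed as the tracker reports new head positions, which is what allows the images to be steered toward the viewer's eyes.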


Frontiers of Human-Centred Computing, Online Communities and Virtual Environments | 2001

Technologies for virtual reality/tele-immersion applications: issues of research in image display and global networking

Thomas A. DeFanti; Daniel J. Sandin; Maxine D. Brown; Dave Pape; Josephine Anstey; Mike Bogucki; Greg Dawe; Andrew E. Johnson; Thomas S. Huang

The Electronic Visualization Laboratory (EVL) at the University of Illinois at Chicago (UIC) has developed an aggressive program over the past decade to partner with scores of computational scientists and engineers all over the world. The focus of this effort has been to create visualization and virtual reality (VR) devices and applications for collaborative exploration of scientific and engineering data. Since 1995, our research and development activities have incorporated emerging high-bandwidth networks like the vBNS and its international connection point STAR TAP, in an effort now called tele-immersion.


Visual Communications and Image Processing | 1998

Next-generation tele-immersive devices for desktop transoceanic collaboration

Andrew E. Johnson; Jason Leigh; Thomas A. DeFanti; Daniel J. Sandin; Maxine D. Brown; Greg Dawe

Tele-Immersion is the combination of collaborative virtual reality and audio/video teleconferencing. With a new generation of high-speed international networks and high-end virtual reality devices spread around the world, effective trans-oceanic tele-immersive collaboration is now possible. But in order to make these shared virtual environments more convenient workspaces, a new generation of desktop display technology is needed.


Archive | 2000

Video-Based Measurement of System Latency

Ding He; Fuhu Liu; Dave Pape; Greg Dawe; Daniel J. Sandin


Archive | 2001

AGAVE: Access Grid Augmented Virtual Environment

Jason Leigh; Greg Dawe; Jonas Talandis; Eric He; Shalini Venkataraman; Jinghua Ge; Daniel J. Sandin; Thomas A. DeFanti


INET | 2000

AccessBot: an Enabling Technology for Telepresence

Jason Leigh; Maggie Rawlings; Javier Girado; Greg Dawe; Ray Fang; Alan Verlo; Muhammad-Ali Khan; Alan Cruz; Dana Plepys; Daniel J. Sandin; Thomas A. DeFanti


Storage and Retrieval for Image and Video Databases | 2002

Low-cost projection-based virtual reality display

David E. Pape; Josephine Anstey; Greg Dawe


Archive | 1999

The ImmersaDesk3 - Experiences With A Flat Panel Display for Virtual Reality

Dave Pape; Josephine Anstey; Mike Bogucki; Greg Dawe; Thomas A. DeFanti; Andrew E. Johnson; Daniel J. Sandin

Collaboration


Dive into Greg Dawe's collaboration.

Top Co-Authors

Daniel J. Sandin (University of Illinois at Chicago)
Jason Leigh (University of Hawaii at Manoa)
Andrew E. Johnson (University of Illinois at Chicago)
Chris Scharver (University of Illinois at Chicago)
Dave Pape (University at Buffalo)
James L. Patton (University of Illinois at Chicago)
Robert V. Kenyon (University of Illinois at Chicago)