Publication


Featured research published by Steven Bathiche.


User Interface Software and Technology | 2009

A practical pressure sensitive computer keyboard

Paul Henry Dietz; Benjamin David Eidelson; Jonathan Westhues; Steven Bathiche

A pressure sensitive computer keyboard is presented that independently senses the force level on every depressed key. The design leverages existing membrane technologies and is suitable for low-cost, high-volume manufacturing. A number of representative applications are discussed.


Optics Express | 2009

Collimated light from a waveguide for a display backlight

Adrian Travis; Timothy Large; Neil Emerton; Steven Bathiche

We report light collimation from a point source without the space normally needed for fan-out. Rays emerge uniformly from all parts of the surface of a blunt wedge light-guide when a point source of light is placed at the thin end, and the source's position determines the ray direction in the manner of a lens. A lenticular array between this light-guide and a liquid crystal panel guides light from color light-emitting diodes to designated sub-pixels, removing the need for color filters and halving power consumption; we foresee much greater power economies and wider application.


International Symposium on Mixed and Augmented Reality | 2013

MonoFusion: Real-time 3D reconstruction of small scenes with a single web camera

Vivek Pradeep; Christoph Rhemann; Shahram Izadi; Christopher Zach; Michael Bleyer; Steven Bathiche

MonoFusion allows a user to build dense 3D reconstructions of their environment in real-time, utilizing only a single, off-the-shelf web camera as the input sensor. The camera could be one already available in a tablet, phone, or a standalone device. No additional input hardware is required. This removes the need for power-intensive active sensors that do not work robustly in natural outdoor lighting. Using the input stream of the camera, we first estimate the 6DoF camera pose using a sparse tracking method. These poses are then used for efficient dense stereo matching between the input frame and a key frame (extracted previously). The resulting dense depth maps are directly fused into a voxel-based implicit model (using a computationally inexpensive method) and surfaces are extracted per frame. The system is able to recover from tracking failures as well as filter out geometrically inconsistent noise from the 3D reconstruction. Our method is both simple to implement and efficient, making such systems even more accessible. This paper details the algorithmic components that make up our system and a GPU implementation of our approach. Qualitative results demonstrate high-quality reconstructions visually comparable to active depth sensor-based systems such as KinectFusion.
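The fusion step the abstract mentions (depth maps averaged into a voxel-based implicit model) can be illustrated with a minimal, hypothetical sketch. This is not the paper's implementation: the 1-D voxel grid along a single viewing ray, the orthographic depth model, the truncation distance, and the equal-weight averaging are all simplifying assumptions made here to show the idea of truncated signed-distance (TSDF) fusion.

```python
import numpy as np

TRUNC = 0.1  # truncation distance in metres (assumed)

def fuse_depth(tsdf, weights, voxel_z, depth):
    """Fuse one depth observation into a running TSDF average."""
    sdf = depth - voxel_z                      # signed distance to surface along the ray
    valid = sdf > -TRUNC                       # ignore voxels far behind the surface
    tsdf_obs = np.clip(sdf, -TRUNC, TRUNC) / TRUNC
    w_obs = valid.astype(float)
    new_w = weights + w_obs
    fused = np.where(new_w > 0,
                     (tsdf * weights + tsdf_obs * w_obs) / np.maximum(new_w, 1e-9),
                     tsdf)
    return fused, new_w

voxel_z = np.linspace(0.0, 2.0, 21)            # voxel centres along one camera ray
tsdf = np.zeros_like(voxel_z)
weights = np.zeros_like(voxel_z)

# Fuse two noisy depth measurements of a surface at roughly 1.0 m.
for depth in (0.98, 1.02):
    tsdf, weights = fuse_depth(tsdf, weights, voxel_z, depth)

# The zero crossing of the fused TSDF marks the reconstructed surface;
# a full system would extract a mesh from this implicit model per frame.
mask = weights > 0
surface_z = voxel_z[mask][np.argmin(np.abs(tsdf[mask]))]
```

Averaging per-voxel signed distances like this is what makes fusion cheap: each new depth map is a constant-time update per voxel, and noise in individual frames cancels out over time.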


Optics Express | 2010

Image capture via a wedge light-guide with no margins

Adrian Travis; Timothy Large; Neil Emerton; Zhaoming Zhu; Steven Bathiche

We report the capture of images via a wedge light-guide without the margin for fan-in needed heretofore. While this lets one look out of a slim panel as if it were a periscope, half the power is lost and resolution is degraded by aperture diffraction. Volume gratings may resolve these drawbacks at certain wavelengths and we consider how these might be extruded.


International Journal of Human-Computer Interaction | 2009

Hardware Support for Navigating Large Digital Documents

Nicholas Chen; François Guimbretière; Liyang Sun; Mary Czerwinski; Gian Pangaro; Steven Bathiche

Buxton and Myers (1986) compared a specialized document navigation device and a scrollbar. Their device tracked finger movements along two touch-sensitive areas: one for absolute movement and one for relative movement. They found that their system was faster when navigating a 5-page document. We identified finger-tracking issues with the Buxton and Myers device for larger documents (100 pages) and developed an alternative device employing a range sensor and rotary encoder to track finger movement. We ran a new experiment comparing the traditional scrollbar, the Buxton and Myers design, and our new design. Both the Buxton and Myers design and our new design were poorly received by users compared to the scrollbar. Our results indicate that in a large document, factors beyond finger tracking accuracy influence the performance of a device providing absolute movement. From these results, we identify possible improvements necessary to implement effective and practical absolute navigation devices.


User Interface Software and Technology | 2011

Scopemate: a robotic microscope

Cati N. Boulanger; Paul Henry Dietz; Steven Bathiche

Scopemate is a robotic microscope that tracks the user for inspection microscopy. The novel input device combines an optically augmented web-cam with a head tracker. A head tracker controls the inspection angle of a webcam fitted with appropriate microscope optics. This allows an operator the full use of their hands while intuitively looking at the work area from different perspectives.


SID Symposium Digest of Technical Papers | 2011

71.4: Invited Paper: Imaging via Backlights

Adrian Travis; Neil Emerton; Timothy Large; Liying Chen; Steven Bathiche

We have used a wedge light guide to capture images in front of an LCD as if from a point deep behind. We have also used it to illuminate an LCD so as to be visible to only one viewer. These two features offer the prospect of a digital window.


SID Symposium Digest of Technical Papers | 2010

16.2: Backlight for View-Sequential Autostereo 3D

Adrian Travis; Neil Emerton; Timothy Large; Steven Bathiche; Bernie Rihn

We have demonstrated a backlight which emits collimated light whose direction can be scanned through 16°. Combined with a high frame rate LCD, this lets us display stereo 3D with no need for glasses.


User Interface Software and Technology | 2011

Scopemate: a tracking inspection microscope

Cati N. Boulanger; Paul Henry Dietz; Steven Bathiche

We propose a new interaction mechanism for inspection microscopy. The novel input device combines an optically augmented web-cam with a head tracker. A head tracker controls the inspection angle of a webcam fitted with appropriate microscope optics. This allows an operator the full use of their hands while intuitively looking at the work area from different perspectives.


Archive | 2004

Manual controlled scrolling

Ken Hinckley; Steven Bathiche; James H. Cauthorn; Michael J. Sinclair

Collaboration


Dive into Steven Bathiche's collaborations.
