
Publication


Featured research published by Glenn M. Schuster.


Computational Optical Sensing and Imaging | 2013

Fiber-coupled monocentric lens imaging

Joseph E. Ford; Igor Stamenov; Stephen J. Olivas; Glenn M. Schuster; Nojan Motamedi; Ilya Agurok; Ron A. Stack; Adam Johnson; Rick L. Morrison

Monocentric lenses have proven exceptionally capable of high numerical aperture wide-field imaging - provided the overall system can accommodate a spherically curved image surface. We will present a summary of recent work on the design optimization and experimental demonstrations of monocentric wide-field imaging, including systems based on waveguide coupling of the image to conventional focal plane sensor(s).


Computer Vision and Pattern Recognition | 2017

A Wide-Field-of-View Monocentric Light Field Camera

Donald G. Dansereau; Glenn M. Schuster; Joseph E. Ford; Gordon Wetzstein

Light field (LF) capture and processing are important in an expanding range of computer vision applications, offering rich textural and depth information and simplification of conventionally complex tasks. Although LF cameras are commercially available, no existing device offers wide field-of-view (FOV) imaging. This is due in part to the limitations of fisheye lenses, for which a fundamentally constrained entrance pupil diameter severely limits depth sensitivity. In this work we describe a novel, compact optical design that couples a monocentric lens with multiple sensors using microlens arrays, allowing LF capture with an unprecedented FOV. Leveraging capabilities of the LF representation, we propose a novel method for efficiently coupling the spherical lens and planar sensors, replacing expensive and bulky fiber bundles. We construct a single-sensor LF camera prototype, rotating the sensor relative to a fixed main lens to emulate a wide-FOV multi-sensor scenario. Finally, we describe a processing toolchain, including a convenient spherical LF parameterization, and demonstrate depth estimation and post-capture refocus for indoor and outdoor panoramas with 15 x 15 x 1600 x 200 pixels (72 MPix) and a 138° FOV.
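As a quick check, the four light-field dimensions quoted above multiply out to exactly the stated 72 MPix (a one-line arithmetic sketch; the axis labels are assumed from the standard two-plane LF parameterization, not taken from the paper):

```python
# The paper quotes 15 x 15 x 1600 x 200 pixels (72 MPix).
s, t, u, v = 15, 15, 1600, 200      # two angular and two spatial axes (assumed labels)
total = s * t * u * v
print(total)                        # 72000000 samples, i.e. 72 MPix
```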


Proceedings of SPIE | 2014

Digital image processing for wide-angle highly spatially variant imagers

Stephen J. Olivas; Michal Šorel; Ashkan Arianpour; Igor Stamenov; Nima Nikzad; Glenn M. Schuster; Nojan Motamedi; William M. Mellette; Ron A. Stack; Adam Johnson; Rick L. Morrison; Ilya Agurok; Joseph E. Ford

High-resolution, wide field-of-view, large depth-of-focus imaging systems are greatly desired and have received much attention from researchers who seek to extend the capabilities of cameras. Monocentric lenses outperform other wide field-of-view lenses, with the drawback that they form a hemispheric image surface that is incompatible with current sensor technology. Fiber optic bundles can be used to relay the image the lens produces to the sensor's planar surface, which requires image processing to correct for artifacts inherent to fiber bundle image transfer. Using a prototype fiber-coupled monocentric lens imager, we capture single-exposure focal-swept images from which we seek to produce extended depth-of-focus images. Point spread functions (PSFs) were measured in the lab and found to be both angle and depth dependent, so the deblurring must be treated as a spatially variant inverse problem. This synthesis of information allowed us to establish a framework to mitigate fiber bundle artifacts and extend the depth-of-focus of the imaging system.
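The spatially variant deconvolution described here depends on the authors' measured PSFs, but the general idea, tiling the image into regions over which the PSF is approximately invariant and inverting each tile separately, can be sketched with a frequency-domain Wiener filter. This is a generic illustration under that assumption; the function names, tile size, and noise-to-signal constant are illustrative, not the paper's method:

```python
import numpy as np

def wiener_deconvolve(tile, psf, nsr=1e-3):
    """Frequency-domain Wiener filter: X ~ Y * conj(H) / (|H|^2 + NSR)."""
    # Center the kernel at the origin, then zero-pad its transform to tile size.
    H = np.fft.fft2(np.fft.ifftshift(psf), s=tile.shape)
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)
    return np.real(np.fft.ifft2(np.fft.fft2(tile) * W))

def deconvolve_spatially_variant(image, psf_for_tile, tile=64):
    """Per-tile inversion; psf_for_tile(i, j) returns the local (angle/depth
    dependent) PSF for the tile whose top-left corner is (i, j)."""
    out = np.zeros_like(image, dtype=float)
    for i in range(0, image.shape[0], tile):
        for j in range(0, image.shape[1], tile):
            block = image[i:i + tile, j:j + tile]
            out[i:i + tile, j:j + tile] = wiener_deconvolve(block, psf_for_tile(i, j))
    return out
```

A practical implementation would overlap and window the tiles to suppress blocking artifacts at tile boundaries; this sketch omits that for brevity.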


Optics Express | 2014

Planar waveguide LED illuminator with controlled directionality and divergence

William M. Mellette; Glenn M. Schuster; Joseph E. Ford

We present a versatile illumination system where white light emitting diodes are coupled through a planar waveguide to periodically patterned extraction features at the focal plane of a two dimensional lenslet array. Adjusting the position of the lenslet array allows control over both the directionality and divergence of the emitted beam. We describe an analytic design process, and show optimal designs can achieve high luminous emittance (1.3x10⁴ lux) over a 2x2 foot aperture with over 75% optical efficiency while simultaneously allowing beam steering over ± 60° and divergence control from ± 5° to fully hemispherical output. Finally, we present experimental results of a prototype system which validate the design model.
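The emittance figure above can be turned into a rough total-flux estimate over the stated 2x2 foot aperture. The foot-to-meter conversion is standard; the resulting lumen number is a back-of-envelope estimate, not a figure from the paper:

```python
# Back-of-envelope: total luminous flux = emittance (lux = lm/m^2) x aperture area.
FOOT_M = 0.3048                      # meters per foot
area_m2 = (2 * FOOT_M) ** 2          # 2x2 foot aperture, about 0.37 m^2
flux_lm = 1.3e4 * area_m2            # roughly 4.8e3 lumens through the aperture
```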


Optical Fiber Communication Conference | 2016

61 Port 1×6 selector switch for data center networks

William M. Mellette; Glenn M. Schuster; George Porter; Joseph E. Ford

We present design and preliminary characterization of a scalable MEMS-based “selector switch” for high performance computing networks. The 170 μs, 61-port prototype uses relay image steering to route all 61 SMF channels through one of six pre-structured interconnects.


Spie Newsroom | 2016

Panoramic full-frame imaging with monocentric lenses and curved fiber bundles

Joseph E. Ford; Salman Karbasi; Ilya Agurok; Igor Stamenov; Glenn M. Schuster; Nojan Motamedi; Ash Arianpour; William M. Mellette; Adam Johnson; Ryan Tennill; Rick L. Morrison; Ron A. Stack

Panoramic imaging is important for many different applications, including content for immersive virtual reality. Although compact 360° cameras can be made from an array of small-aperture 'smartphone' imagers, their small (typically 1.1 μm) pixels provide low dynamic range. Moreover, digital single-lens-reflex and cinematographic imagers have 4–8 μm pixels, but require correspondingly longer focal length lenses. Conventional 'fisheye' lenses are also problematic because they are bulky and have low light collection (typically F/2.8 to F/4, where F is the focal length divided by the lens aperture). An alternative path to panoramic imaging is 'monocentric' optics, where all surfaces (including the image surface) are concentric hemispheres.1 The symmetry of these lenses means that lateral color and off-axis aberrations (astigmatism and coma) are eliminated. In addition, the simple lens structures can be used to correct for spherical and axial color aberrations to yield extraordinarily wide-angle resolution and light collection.2 The image that is produced can be coupled to a conventional focal plane via a fiber bundle faceplate (with a curved input and flat output face).3 Fiber faceplates are solid glass elements made of small, high-index optical fibers separated by a thin, low-index cladding, used for nonimaging transfer of light between the input and output faces. In our research within the Defense Advanced Research Projects Agency (DARPA) SCENICC (Soldier Centric Imaging via Computational Cameras) program, we have shown that fiber bundles can reach a spatial resolution of 2 μm.4

[Figure 1. Geometry of a monocentric lens (left) and the spherical image surface it forms (right), which can be coupled to CMOS focal plane(s) by an array of straight fiber bundles (top) or a single curved fiber bundle (bottom). The F-number is the focal length (f) divided by the lens aperture.]
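The F-number definition in the text lends itself to a tiny worked example: image-plane irradiance scales roughly as 1/F², so the quoted fisheye range of F/2.8 to F/4 spans about a factor of two in collected light. The scaling law is standard photometry rather than a claim from the article, and the function name is illustrative:

```python
# F-number as defined in the text: focal length divided by aperture diameter.
def f_number(focal_length_mm, aperture_mm):
    return focal_length_mm / aperture_mm

# Irradiance at the image plane scales roughly as 1/F^2, so F/2.8 collects
# about twice the light of F/4 (the fisheye range quoted above):
ratio = (4.0 / 2.8) ** 2   # about 2.04x
```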


Renewable Energy and the Environment (2013), paper DT3E.3 | 2013

Planar Waveguide Illuminator with Variable Directionality and Divergence

William M. Mellette; Glenn M. Schuster; Ilya Agurok; Joseph E. Ford

We present the design, model, and experimental characterization of a white light LED illuminator using mechanical actuation of a lenslet array relative to a micro-structured planar waveguide to control the divergence and direction of emitted light.


Journal of Lightwave Technology | 2017

A Scalable, Partially Configurable Optical Switch for Data Center Networks

William M. Mellette; Glenn M. Schuster; George Porter; George Papen; Joseph E. Ford


Applied Optics | 2015

Wink-controlled polarization-switched telescopic contact lenses

Glenn M. Schuster; Ashkan Arianpour; Scott Cookson; Arthur Zhang; Hendrik L; O'Brien T; Alvarez A; Joseph E. Ford


Computational Optical Sensing and Imaging | 2014

Fiber Bundle Image Relay for Monocentric Lenses

Stephen J. Olivas; Nima Nikzad; Igor Stamenov; Ashkan Arianpour; Glenn M. Schuster; Nojan Motamedi; William M. Mellette; Ronald A. Stack; Adam R. Johnson; Rick L. Morrison; Ilya Agurok; Joseph E. Ford

Collaboration


Dive into Glenn M. Schuster's collaboration.

Top Co-Authors

Joseph E. Ford (University of California)
Igor Stamenov (University of California)
Ilya Agurok (University of California)
Arthur Zhang (University of California)
Nojan Motamedi (University of California)
Scott Cookson (University of California)