
Publication


Featured research published by Igor Stamenov.


Applied Optics | 2012

Optimization of two-glass monocentric lenses for compact panoramic imagers: general aberration analysis and specific designs.

Igor Stamenov; Ilya Agurok; Joseph E. Ford

Monocentric lenses have recently changed from primarily a historic curiosity to a potential solution for panoramic high-resolution imagers, where the spherical image surface is directly detected by curved image sensors or optically transferred onto multiple conventional flat focal planes. We compare imaging and waveguide-based transfer of the spherical image surface formed by the monocentric lens onto planar image sensors, showing that both approaches can make the system input aperture and resolution substantially independent of the input angle. We present aberration analysis that demonstrates that wide-field monocentric lenses can be focused by purely axial translation and describe a systematic design process to identify the best designs for two-glass symmetric monocentric lenses. Finally, we use this approach to design an F/1.7, 12 mm focal length imager with an up to 160° field of view and show that it compares favorably in size and performance to conventional wide-angle imagers.
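For scale, the stated specifications imply some simple geometric quantities. A quick sketch of that arithmetic (illustrative only, not taken from the paper, using the facts that the aperture diameter is f divided by the F-number and that a monocentric lens images onto a sphere of radius equal to its focal length):

```python
import math

f_mm = 12.0        # focal length stated in the abstract
f_number = 1.7     # F/1.7
fov_deg = 160.0    # full field of view

# Entrance aperture diameter: D = f / F#
aperture_mm = f_mm / f_number

# A monocentric lens forms its image on a sphere of radius equal to the
# focal length; the image extent across the field is an arc on that sphere.
half_fov_rad = math.radians(fov_deg / 2)
image_arc_mm = 2 * f_mm * half_fov_rad  # arc length across the full field

print(f"aperture diameter ~ {aperture_mm:.2f} mm")   # ~ 7.06 mm
print(f"spherical image arc ~ {image_arc_mm:.1f} mm")  # ~ 33.5 mm
```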


Optics Express | 2013

Switchable telescopic contact lens

Eric J. Tremblay; Igor Stamenov; R. Dirk Beer; Ashkan Arianpour; Joseph E. Ford

We present the design and first demonstration of optics for a telescopic contact lens with independent optical paths for switching between normal and magnified vision. The magnified optical path incorporates a telescopic arrangement of positive and negative annular concentric reflectors to achieve 2.8× magnification on the eye, while light passing through a central clear aperture provides unmagnified vision. We present an experimental demonstration of the contact lens mounted on a life-sized optomechanical model eye and, using a pair of modified commercial 3D television glasses, demonstrate electrically operated polarization switching between normal and magnified vision.


Computational Optical Sensing and Imaging | 2013

Fiber-coupled monocentric lens imaging

Joseph E. Ford; Igor Stamenov; Stephen J. Olivas; Glenn M. Schuster; Nojan Motamedi; Ilya Agurok; Ron A. Stack; Adam Johnson; Rick L. Morrison

Monocentric lenses have proven exceptionally capable of high numerical aperture wide-field imaging, provided the overall system can accommodate a spherically curved image surface. We will present a summary of recent work on the design optimization and experimental demonstrations of monocentric wide-field imaging, including systems based on waveguide coupling of the image to conventional focal plane sensor(s).


Journal of Refractive Surgery | 2013

An Optomechanical Model Eye for Ophthalmological Refractive Studies

Ashkan Arianpour; Eric J. Tremblay; Igor Stamenov; Joseph E. Ford; David J. Schanzlin; Yu-Hwa Lo

PURPOSE: To create an accurate, low-cost optomechanical model eye for investigation of refractive errors in clinical and basic research studies.

METHODS: An optomechanical fluid-filled eye model with dimensions consistent with the human eye was designed and fabricated. Optical simulations were performed on the optomechanical eye model, and the quantified resolution and refractive errors were compared with the widely used Navarro eye model using the ray-tracing software ZEMAX (Radiant Zemax, Redmond, WA). The resolution of the physical optomechanical eye model was then quantified with a complementary metal-oxide semiconductor imager using the image resolution software SFR Plus (Imatest, Boulder, CO). Refractive, manufacturing, and assembly errors were also assessed. A refractive intraocular lens (IOL) and a diffractive IOL were added to the optomechanical eye model for tests and analyses of a 1951 U.S. Air Force target chart.

RESULTS: Resolution and aberrations of the optomechanical eye model and the Navarro eye model were qualitatively similar in ZEMAX simulations. Experimental testing found that the optomechanical eye model reproduced properties pertinent to human eyes, including resolution better than 20/20 visual acuity and a decrease in resolution as the field of view increased in size. The IOLs were also integrated into the optomechanical eye model to image objects at distances of 15, 10, and 3 feet, and they indicated a resolution of 22.8 cycles per degree at 15 feet.

CONCLUSIONS: A life-sized optomechanical eye model with the flexibility to be patient-specific was designed and constructed. The model had the resolution of a healthy human eye and recreated normal refractive errors. This model may be useful in the evaluation of IOLs for cataract surgery.
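The 22.8 cycles/degree figure can be related to Snellen acuity using the standard convention that 20/20 vision corresponds to resolving 1 arcminute, i.e. 30 cycles per degree. A minimal conversion sketch (textbook relation, not taken from the paper):

```python
def cpd_to_snellen_denominator(cpd, base=20.0, cpd_at_20_20=30.0):
    """Convert spatial resolution in cycles/degree to a Snellen denominator,
    assuming 20/20 vision corresponds to 30 cycles/degree."""
    return base * cpd_at_20_20 / cpd

# Model-eye resolution reported with the IOLs at 15 feet:
denom = cpd_to_snellen_denominator(22.8)
print(f"22.8 cpd ~ 20/{denom:.0f}")  # ~ 20/26
```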


Applied Optics | 2015

Image processing for cameras with fiber bundle image relay

Stephen J. Olivas; Ashkan Arianpour; Igor Stamenov; Rick L. Morrison; Ron A. Stack; Adam R. Johnson; Ilya Agurok; Joseph E. Ford

Some high-performance imaging systems generate a curved focal surface and so are incompatible with focal plane arrays fabricated by conventional silicon processing. One example is a monocentric lens, which forms a wide field-of-view high-resolution spherical image with a radius equal to the focal length. Optical fiber bundles have been used to couple between this focal surface and planar image sensors. However, such fiber-coupled imaging systems suffer from artifacts due to image sampling and incoherent light transfer by the fiber bundle as well as resampling by the focal plane, resulting in a fixed obscuration pattern. Here, we describe digital image processing techniques to improve image quality in a compact 126° field-of-view, 30 megapixel panoramic imager, where a 12 mm focal length F/1.35 lens made of concentric glass surfaces forms a spherical image surface, which is fiber-coupled to six discrete CMOS focal planes. We characterize the locally space-variant system impulse response at various stages: monocentric lens image formation onto the 2.5 μm pitch fiber bundle, image transfer by the fiber bundle, and sensing by a 1.75 μm pitch backside illuminated color focal plane. We demonstrate methods to mitigate moiré artifacts and local obscuration, correct for sphere to plane mapping distortion and vignetting, and stitch together the image data from discrete sensors into a single panorama. We compare processed images from the prototype to those taken with a 10× larger commercial camera with comparable field-of-view and light collection.
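The fixed-pattern moiré arises because the 2.5 μm fiber pitch and 1.75 μm pixel pitch sample the image at slightly different spatial frequencies, and their beat sets the artifact scale. A back-of-envelope estimate for the one-dimensional case, assuming two aligned periodic grids (an idealization, not the authors' analysis):

```python
fiber_pitch_um = 2.5    # fiber bundle pitch from the abstract
pixel_pitch_um = 1.75   # sensor pixel pitch from the abstract

# Spatial sampling frequencies (cycles per micron)
f_fiber = 1.0 / fiber_pitch_um
f_pixel = 1.0 / pixel_pitch_um

# The moire (beat) frequency is the difference of the two sampling
# frequencies; its period is the characteristic scale of the pattern.
beat_period_um = 1.0 / abs(f_pixel - f_fiber)
print(f"beat period ~ {beat_period_um:.2f} um")  # ~ 5.83 um
```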


Applied Optics | 2013

Optimization of high-performance monocentric lenses

Igor Stamenov; Ilya Agurok; Joseph E. Ford

The recent application of monocentric lenses for panoramic high-resolution digital imagers raises the question of the achievable performance limits of this lens structure and of techniques for design optimization to approach these limits. This paper defines the important regions of the design space of moderate complexity monocentric lenses and describes systematic and global optimization algorithms for the design of monocentric objective lenses of various focal lengths, apertures, and spectral bandwidths. We demonstrate the trade-off between spectral band, F-number and lens complexity, and provide design examples of monocentric lenses for specific applications.


Proceedings of SPIE | 2014

Digital image processing for wide-angle highly spatially variant imagers

Stephen J. Olivas; Michal Šorel; Ashkan Arianpour; Igor Stamenov; Nima Nikzad; Glenn M. Schuster; Nojan Motamedi; William M. Mellette; Ron A. Stack; Adam Johnson; Rick L. Morrison; Ilya Agurok; Joseph E. Ford

High-resolution, wide field-of-view, and large depth-of-focus imaging systems are greatly desired and have received much attention from researchers who seek to extend the capabilities of cameras. Monocentric lenses are superior in performance to other wide field-of-view lenses, with the drawback that they form a hemispheric image plane which is incompatible with current sensor technology. Fiber optic bundles can be used to relay the image the lens produces to the sensor's planar surface. This requires image processing to correct for artifacts inherent to fiber bundle image transfer. Using a prototype fiber-coupled monocentric lens imager, we capture single-exposure focal-swept images from which we seek to produce extended depth-of-focus images. Point spread functions (PSFs) were measured in the lab and found to be both angle and depth dependent. This spatial variance requires that the inverse problem itself be treated as spatially variant. This synthesis of information allowed us to establish a framework with which to mitigate fiber bundle artifacts and extend the depth-of-focus of the imaging system.
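Spatially variant blur is commonly handled by tiling the image and deconvolving each tile with its local PSF. A minimal sketch of that general idea using per-patch Wiener filtering with NumPy (a generic technique, not the authors' actual pipeline; the `psf_for_tile` lookup is a hypothetical stand-in for the measured angle- and depth-dependent PSFs):

```python
import numpy as np

def wiener_deconv(patch, psf, nsr=1e-2):
    """Wiener deconvolution of one patch with its local PSF.
    nsr is an assumed noise-to-signal ratio (regularization)."""
    H = np.fft.fft2(psf, s=patch.shape)
    G = np.fft.fft2(patch)
    # Wiener filter: H* G / (|H|^2 + NSR)
    F = np.conj(H) * G / (np.abs(H) ** 2 + nsr)
    return np.real(np.fft.ifft2(F))

def deblur_tiled(image, psf_for_tile, tile=64):
    """Deconvolve an image tile-by-tile with a spatially varying PSF.
    psf_for_tile(i, j) returns the local PSF for the tile at (i, j)."""
    out = np.zeros_like(image, dtype=float)
    for i in range(0, image.shape[0], tile):
        for j in range(0, image.shape[1], tile):
            patch = image[i:i + tile, j:j + tile]
            out[i:i + tile, j:j + tile] = wiener_deconv(patch, psf_for_tile(i, j))
    return out

# Example: with an identity PSF (delta at the origin) the restoration
# returns the input scaled by the regularization factor 1/(1 + nsr).
img = np.random.rand(128, 128)
delta = np.zeros((5, 5))
delta[0, 0] = 1.0
restored = deblur_tiled(img, lambda i, j: delta)
```

In practice the tiles would overlap and be blended to hide seams; this sketch keeps them disjoint for brevity.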


Proceedings of SPIE | 2015

Curved fiber bundles for monocentric lens imaging

Salman Karbasi; Igor Stamenov; Nojan Motamedi; Ashkan Arianpour; Adam R. Johnson; Ron A. Stack; Chris LaReau; Ryan Tenill; Rick L. Morrison; Ilya Agurok; Joseph E. Ford

Monocentric lenses enable high-resolution panoramic cameras in which imaging fiber bundles transport the hemispherical image surface to conventional focal planes. Refraction at the curved image surface limits the field of view coupled through a single bundle of straight fibers to less than ±34°, even for NA 1 fibers. Previously we demonstrated a nearly continuous 128° field of view using a single lens and multiple adjacent straight fiber-coupled image sensors, but this imposes the mechanical complexity of fiber bundle shaping and integration. However, a 3D waveguide structure with internally curved optical fiber pathways can couple the full continuous field of view onto a single focal plane. Here, we demonstrate wide-field imaging using a monocentric lens and a single curved fiber bundle, showing that a 3D bundle formed from a tapered fiber bundle can relay a 128° field of view from a curved input to a planar output face. We show numerically that the coupling efficiency of light into the tapered bundle at different fields of view depends on the taper ratio of the bundle as well as on the center of curvature chosen for polishing the fiber bundle facet. We characterize a tapered fiber bundle by measuring the angle-dependent impulse response, the transmission efficiency, and the divergence angle of the light propagating from the output end of the fiber.


International Optical Design Conference | 2014

Broad-spectrum fiber-coupled monocentric lens imaging

Igor Stamenov; Stephen J. Olivas; Ashkan Arianpour; Ilya Agurok; Adam R. Johnson; Ronald A. Stack; Joseph E. Ford

Monocentric lenses provide high-resolution panoramic imaging onto a spherical image surface. We characterize the curved image transfer via fiber bundles onto CMOS focal planes and two lens prototypes designed for the visible and VNIR spectra.


SPIE Newsroom | 2016

Panoramic full-frame imaging with monocentric lenses and curved fiber bundles

Joseph E. Ford; Salman Karbasi; Ilya Agurok; Igor Stamenov; Glenn M. Schuster; Nojan Motamedi; Ash Arianpour; William M. Mellette; Adam Johnson; Ryan Tennill; Rick L. Morrison; Ron A. Stack

Panoramic imaging is important for many different applications, including content for immersive virtual reality. Although compact 360° cameras can be made from an array of small-aperture 'smartphone' imagers, their small (typically 1.1 μm) pixels provide low dynamic range. Digital single-lens-reflex and cinematographic imagers have 4–8 μm pixels, but require correspondingly longer focal length lenses. Conventional 'fisheye' lenses are also problematic because they are bulky and have low light collection (typically F/2.8 to F/4, where the F-number is the focal length divided by the lens aperture). An alternative path to panoramic imaging is 'monocentric' optics, where all surfaces, including the image surface, are concentric hemispheres.1 The symmetry of these lenses means that lateral color and off-axis aberrations (astigmatism and coma) are eliminated. In addition, the simple lens structures can be used to correct for spherical and axial color aberrations to yield extraordinarily wide-angle resolution and light collection.2 The image that is produced can be coupled to a conventional focal plane via a fiber bundle faceplate (with a curved input and flat output face).3 Fiber faceplates are solid glass elements made of small, high-index optical fibers separated by a thin, low-index cladding, used for nonimaging transfer of light between the input and output faces. In our research within the Defense Advanced Research Projects Agency (DARPA) SCENICC (Soldier Centric Imaging via Computational Cameras) program, we have shown that fiber bundles can reach a spatial resolution of 2 μm.4

Figure 1. Geometry of a monocentric lens (left) and the spherical image surface it forms (right), which can be coupled to CMOS focal plane(s) by an array of straight fiber bundles (top) or a single curved fiber bundle (bottom). The F-number is the focal length (f) divided by the lens aperture.
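On-axis light collection scales inversely with the square of the F-number, which is why fast monocentric optics compare favorably to F/2.8 to F/4 fisheye lenses. A quick check of that textbook relation, comparing the F/1.35 monocentric prototype described above against typical fisheye apertures:

```python
def light_collection_ratio(f_num_a, f_num_b):
    """Relative on-axis irradiance of two lenses: (F_b / F_a)^2."""
    return (f_num_b / f_num_a) ** 2

# F/1.35 monocentric lens vs typical fisheye apertures
print(light_collection_ratio(1.35, 2.8))  # ~ 4.3x more light than F/2.8
print(light_collection_ratio(1.35, 4.0))  # ~ 8.8x more light than F/4
```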

Collaboration

Dive into Igor Stamenov's collaborations.

Top co-authors:

Joseph E. Ford, University of California
Ilya Agurok, University of California
Nojan Motamedi, University of California
Salman Karbasi, University of California