
Publication


Featured research published by Sriram Sivaramakrishnan.


Custom Integrated Circuits Conference | 2012

A 180nm CMOS image sensor with on-chip optoelectronic image compression

Albert Wang; Sriram Sivaramakrishnan; Alyosha Molnar

We demonstrate an image sensor which directly acquires images in a compressed format. The sensor uses diffractive optical structures integrated in the CMOS back-end layer stack to compute a low-order 2D spatial Gabor transform on visual scenes. As this computation occurs in the optical domain, the readout back-end uses the transform outputs to implement the subsequent image digitization and compression simultaneously. Implemented in a 180nm logic CMOS process, the image sensor uses a 384×384 array of complementary angle-sensitive pixel pairs, consumes 2mW at a frame rate of 15fps, and achieves a 10:1 compression ratio on test images.
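The low-order 2D spatial Gabor transform that the sensor computes optically can be sketched numerically. The kernel size, wavelength, and orientations below are illustrative parameters, not the chip's actual optical design values:

```python
import numpy as np

def gabor_kernel(size, wavelength, theta, sigma):
    """Real-valued 2D Gabor kernel: a sinusoid windowed by a Gaussian."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate coordinates so the sinusoid runs along orientation theta.
    xr = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    return envelope * np.cos(2.0 * np.pi * xr / wavelength)

def gabor_transform(image, orientations=4, size=9, wavelength=4.0, sigma=2.0):
    """Project an image onto a small bank of oriented Gabor filters.

    Each output map is a valid-mode 2D correlation of the image with one
    kernel -- a software stand-in for the computation the complementary
    angle-sensitive pixel pairs perform in the optical domain.
    """
    h, w = image.shape
    out = []
    for k in range(orientations):
        kern = gabor_kernel(size, wavelength, k * np.pi / orientations, sigma)
        resp = np.zeros((h - size + 1, w - size + 1))
        for i in range(resp.shape[0]):
            for j in range(resp.shape[1]):
                resp[i, j] = np.sum(image[i:i + size, j:j + size] * kern)
        out.append(resp)
    return np.stack(out)

coeffs = gabor_transform(np.random.default_rng(0).random((32, 32)))
```

Because the transform is low-order (a few orientations and scales rather than one value per pixel), the readout that follows it can digitize far fewer coefficients than a conventional raster scan, which is where the compression ratio comes from.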


Computer Vision and Pattern Recognition | 2016

ASP Vision: Optically Computing the First Layer of Convolutional Neural Networks Using Angle Sensitive Pixels

Huaijin Chen; Suren Jayasuriya; Jiyue Yang; Judy Stephen; Sriram Sivaramakrishnan; Ashok Veeraraghavan; Alyosha Molnar

Deep learning using convolutional neural networks (CNNs) is quickly becoming the state-of-the-art for challenging computer vision applications. However, deep learning's power consumption and bandwidth requirements currently limit its application in embedded and mobile systems with tight energy budgets. In this paper, we explore the energy savings of optically computing the first layer of CNNs. To do so, we utilize bio-inspired Angle Sensitive Pixels (ASPs), custom CMOS diffractive image sensors which act similarly to Gabor filter banks in the V1 layer of the human visual cortex. ASPs replace both image sensing and the first layer of a conventional CNN by directly performing optical edge filtering, saving sensing energy, data bandwidth, and CNN FLOPS. Our experimental results (both on synthetic data and a hardware prototype) for a variety of vision tasks such as digit recognition, object recognition, and face identification demonstrate a 97% reduction in image sensor power consumption and a 90% reduction in data bandwidth from sensor to CPU, while achieving performance comparable to traditional deep learning pipelines.
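A back-of-the-envelope multiply-accumulate count illustrates why moving the first convolution layer into optics saves digital compute. The layer shapes below are hypothetical, chosen only to make the arithmetic concrete:

```python
def conv_layer_flops(h, w, c_in, c_out, k):
    """FLOPs for a stride-1, 'same'-padded 2D convolution layer:
    one multiply and one add per tap, per output channel, per pixel."""
    return h * w * c_in * c_out * k * k * 2

# Hypothetical small CNN on a 32x32 grayscale input.
first = conv_layer_flops(32, 32, 1, 16, 5)    # layer the ASP optics replace
rest = conv_layer_flops(16, 16, 16, 32, 5)    # remaining digital layers
fraction_saved = first / (first + rest)
```

For deeper networks the first layer is a smaller fraction of total FLOPs, so (per the abstract) the dominant savings in practice come from sensing energy and sensor-to-CPU bandwidth rather than from the eliminated arithmetic alone.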


International Conference on Computational Photography | 2014

A switchable light field camera architecture with Angle Sensitive Pixels and dictionary-based sparse coding

Matthew Hirsch; Sriram Sivaramakrishnan; Suren Jayasuriya; Albert Wang; Alyosha Molnar; Ramesh Raskar; Gordon Wetzstein

We propose a flexible light field camera architecture that is at the convergence of optics, sensor electronics, and applied mathematics. Through the co-design of a sensor comprising tailored Angle Sensitive Pixels and advanced reconstruction algorithms, we show that, contrary to light field cameras today, our system can use the same measurements captured in a single sensor image to recover either a high-resolution 2D image, a low-resolution 4D light field using fast, linear processing, or a high-resolution light field using sparsity-constrained optimization.


Journal of Instrumentation | 2012

Robustness of planar Fourier capture arrays to colour changes and lost pixels

Patrick R. Gill; Changhyuk Lee; Sriram Sivaramakrishnan; Alyosha Molnar

Planar Fourier capture arrays (PFCAs) are optical sensors built entirely in standard microchip manufacturing flows. PFCAs are composed of ensembles of angle sensitive pixels (ASPs) that each report a single coefficient of the Fourier transform of the far-away scene. Here we characterize the performance of PFCAs under the following three non-optimal conditions. First, we show that PFCAs can operate while sensing light of a wavelength other than the design point. Second, if only a randomly-selected subset of 10% of the ASPs are functional, we can nonetheless reconstruct the entire far-away scene using compressed sensing. Third, if the wavelength of the imaged light is unknown, it can be inferred by demanding self-consistency of the outputs.
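The claim that a scene can be reconstructed from a random 10% subset of ASP outputs is a standard compressed-sensing result. A toy 1D sketch, using generic random measurements and iterative soft-thresholding (ISTA) rather than the paper's actual ASP measurement model or solver:

```python
import numpy as np

def ista(A, y, lam=0.05, steps=300):
    """Iterative soft-thresholding for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        g = A.T @ (A @ x - y)              # gradient of the quadratic term
        z = x - g / L                      # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # shrinkage
    return x

rng = np.random.default_rng(1)
n = 100                                    # scene coefficients
x_true = np.zeros(n)
x_true[[7, 40, 81]] = [1.0, -0.8, 0.6]     # sparse "scene"
m = 10                                     # only 10% of sensors report
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true                             # measurements from working sensors
x_hat = ista(A, y)
```

With a sufficiently sparse scene and incoherent measurements, the L1-regularized solution concentrates on the true support even though the system is heavily underdetermined; the exact recovery quality depends on sparsity and measurement count.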


International Conference on 3D Vision | 2015

Depth Fields: Extending Light Field Techniques to Time-of-Flight Imaging

Suren Jayasuriya; Adithya Kumar Pediredla; Sriram Sivaramakrishnan; Alyosha Molnar; Ashok Veeraraghavan

A variety of techniques such as light field, structured illumination, and time-of-flight (TOF) are commonly used for depth acquisition in consumer imaging, robotics, and many other applications. Unfortunately, each technique suffers from individual limitations that prevent robust depth sensing. In this paper, we explore the strengths and weaknesses of combining light field and time-of-flight imaging, particularly the feasibility of an on-chip implementation as a single hybrid depth sensor. We refer to this combination as depth field imaging. Depth fields combine light field advantages such as synthetic aperture refocusing with TOF imaging advantages such as high depth resolution and coded signal processing to resolve multipath interference. We show applications including synthesizing virtual apertures for TOF imaging, improved depth mapping through partial and scattering occluders, and single frequency TOF phase unwrapping. Utilizing space, angle, and temporal coding, depth fields can improve depth sensing in the wild and generate new insights into the dimensions of light's plenoptic function.


International Electron Devices Meeting | 2011

Enhanced angle sensitive pixels for light field imaging

Sriram Sivaramakrishnan; Albert Wang; Patrick R. Gill; Alyosha Molnar

Previously demonstrated angle sensitive pixels (ASPs) have been shown to enable integrated digital light-field imaging in CMOS, but suffer from significantly reduced pixel quantum efficiency and increased sensor size. This work demonstrates ASP devices that use phase gratings and a pair of interleaved diodes to double pixel density and improve quantum efficiency by a factor of 4.


IEEE Transactions on Electron Devices | 2016

Design and Characterization of Enhanced Angle Sensitive Pixels

Sriram Sivaramakrishnan; Albert Wang; Patrick R. Gill; Alyosha Molnar

Stacks of metal gratings placed above a photodiode have been demonstrated to show a strong sinusoidal response to angle. Although these devices, called angle sensitive pixels (ASPs), enable new imaging modalities on CMOS, such as light field capture, their optical sensitivity is limited by the metallic gratings. In this paper, we provide a detailed characterization of the angular response of ASPs and report on a new set of optically efficient structures for angle detection. By analyzing the properties of the Talbot effect, we present a quantitative model for the angular sensitivity, β, of an ASP and qualitatively describe the dependence of modulation depth, m, on the grating parameters. We then describe structures that employ phase gratings and interleaved diffusion diodes to provide angle sensitivity while improving the quantum efficiency by 4×. A post-CMOS process flow for the fabrication of phase gratings is described. Finally, we show that ASPs that use p+ diffusion diodes embedded in an n-well diode improve their modulation efficiency by over 50% compared with interleaved diodes that share the p-substrate.
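The sinusoidal angular response and the quadrature readout it enables can be sketched with the model the abstract names: response modulated by depth m at angular sensitivity β. The numeric values of m and β below are illustrative, not the characterized device parameters:

```python
import math

def asp_response(theta, i0=1.0, m=0.5, beta=15.0, alpha=0.0):
    """Idealized ASP photoresponse, sinusoidal in incidence angle theta:

        I(theta) = i0 * (1 + m * cos(beta * theta + alpha))

    beta is the angular sensitivity (how fast the response oscillates with
    angle), m the modulation depth, alpha the grating phase offset."""
    return i0 * (1.0 + m * math.cos(beta * theta + alpha))

def estimate_angle(r0, r90, r180, r270, beta=15.0):
    """Recover incidence angle from four pixels whose grating phases alpha
    are in quadrature (0, pi/2, pi, 3pi/2) -- a common ASP readout scheme.
    Differencing opposite phases cancels i0; atan2 recovers beta*theta."""
    return math.atan2(r270 - r90, r0 - r180) / beta

theta = 0.02  # radians of incidence angle
reads = [asp_response(theta, alpha=a)
         for a in (0.0, math.pi / 2, math.pi, 3 * math.pi / 2)]
theta_hat = estimate_angle(*reads)
```

Because the differences r0 − r180 = 2m·cos(βθ) and r270 − r90 = 2m·sin(βθ) share the factor 2m, the angle estimate is independent of both the overall intensity i0 and the modulation depth m, which is what makes the quadrature arrangement robust.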


Optics Letters | 2015

Dual light field and polarization imaging using CMOS diffractive image sensors

Suren Jayasuriya; Sriram Sivaramakrishnan; Ellen Chuang; Debashree Guruaribam; Albert Wang; Alyosha Molnar

In this Letter we present, to the best of our knowledge, the first integrated CMOS image sensor that can simultaneously perform light field and polarization imaging without the use of external filters or additional optical elements. Previous work has shown how photodetectors with two stacks of integrated metal gratings above them (called angle sensitive pixels) diffract light in a Talbot pattern to capture four-dimensional light fields. We show, in addition to diffractive imaging, that these gratings polarize incoming light and characterize the response of these sensors to polarization and incidence angle. Finally, we show two applications of polarization imaging: imaging stress-induced birefringence and identifying specular reflections in scenes to improve light field algorithms for these scenes.
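The polarization behavior of the stacked metal gratings can be sketched as a partial polarizer following Malus's law. The extinction fraction eta, axis, and intensities below are illustrative assumptions, not the characterized sensor response:

```python
import math

def asp_polarization_response(theta_pol, axis=0.0, eta=0.4, i0=1.0):
    """Partial-polarizer model of the stacked gratings: transmitted
    intensity follows Malus's law (cos^2 of the angle between incoming
    polarization and the grating axis), with only a fraction eta of the
    light modulated because the gratings polarize partially."""
    return i0 * ((1.0 - eta) + eta * math.cos(theta_pol - axis) ** 2)

# Contrast between polarization aligned with and crossed to the grating axis.
i_max = asp_polarization_response(0.0)
i_min = asp_polarization_response(math.pi / 2)
contrast = (i_max - i_min) / (i_max + i_min)
```

A scene application like specular-reflection detection then reduces to comparing such responses across pixels with differently oriented gratings: specular highlights are strongly polarized and show high contrast, while diffuse surfaces do not.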


IEEE Sensors Journal | 2016

A Polar Symmetric CMOS Image Sensor for Rotation Invariant Measurement

Sriram Sivaramakrishnan; Changhyuk Lee; Ben Johnson; Alyosha Molnar

We present a CMOS image sensor for efficient capture of polar symmetric imaging targets. The array uses circular photodiodes, arranged in concentric rings to capture, for example, diffraction patterns generated by optically probing a revolving MEMS device. The chip is designed with a vacant, central spot to facilitate the easy single-axis alignment of the probing illumination, target device, and detector. Imaging of high-speed rotation (>1 kfps) is made possible by dividing the array into multiple concentric bands with sectorwise addressing control. We introduce a global shutter pixel reset scheme that reduces fixed pattern noise by being insensitive to parasitic capacitance from variable routing. We demonstrate the sensor's capability to measure rotation angle with a precision of 32 μrad and rotation rates up to 300 rpm. Finally, we demonstrate the concept of a compact optical metrology system for continuous inertial sensor calibration by imaging the diffraction pattern created by a commercial MEMS accelerometer probed by a red laser shone through the axis of symmetry of the image sensor.
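One way rotation angle can be recovered from a ring of the polar array is from the phase shift of an angular harmonic between two readouts. This toy sketch uses a synthetic 3rd-harmonic intensity pattern; the pattern, ring resolution, and harmonic order are all illustrative, not the sensor's actual readout pipeline:

```python
import math

def ring_samples(angle, n=256):
    """Simulated intensity around one ring of the polar array: a 3-fold
    symmetric feature pattern rotated by 'angle' radians."""
    return [math.cos(3 * (2 * math.pi * k / n - angle)) for k in range(n)]

def estimate_rotation(ref, cur, harmonic=3):
    """Recover the rotation between two ring readouts from the phase shift
    of their dominant angular harmonic, via a single DFT bin."""
    n = len(ref)
    def dft_bin(x):
        re = sum(v * math.cos(2 * math.pi * harmonic * k / n)
                 for k, v in enumerate(x))
        im = -sum(v * math.sin(2 * math.pi * harmonic * k / n)
                  for k, v in enumerate(x))
        return complex(re, im)
    ratio = dft_bin(cur) / dft_bin(ref)
    dphi = math.atan2(ratio.imag, ratio.real)
    # The bin phase advances as -harmonic * angle; divide by the harmonic
    # order (and flip sign) to get the mechanical rotation.
    return -dphi / harmonic

ref = ring_samples(0.0)
cur = ring_samples(0.001)   # 1 mrad mechanical rotation
rot = estimate_rotation(ref, cur)
```

Because the phase of a single DFT bin interpolates between pixel positions, this kind of estimator resolves rotations far smaller than one pixel pitch, consistent with the microradian-scale precision reported above.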


IEEE Sensors | 2013

A high-speed polar-symmetric imager for real-time calibration of rotational inertial sensors

Ben Johnson; Changhyuk Lee; Sriram Sivaramakrishnan; Alyosha Molnar

We present a high-speed (>1 kfps), circular, CMOS imaging array for contactless, optical measurement of rotating inertial sensors. The imager is designed for real-time optical readout and calibration of a MEMS accelerometer revolving at greater than 1000 rpm. The imager uses a uniform circular arrangement of pixels to enable rapid imaging of rotational objects. Furthermore, each photodiode itself is circular to maintain uniform response throughout the entire revolution. Combining a high frame rate and a uniform response to motion, the imager can achieve sub-pixel resolution (25 nm) of the displacement of microscale features. To avoid fixed pattern noise arising from non-uniform routing within the array, we implemented a new global shutter technique that is insensitive to parasitic capacitance. To ease integration with various MEMS platforms, the system has SPI control, on-chip bias generation, sub-array imaging, and digital data read-out.
