Daniel Matolin
Austrian Institute of Technology
Publications
Featured research published by Daniel Matolin.
IEEE Journal of Solid-state Circuits | 2011
Christoph Posch; Daniel Matolin; Rainer Wohlgenannt
The biomimetic CMOS dynamic vision and image sensor described in this paper is based on a QVGA (304×240) array of fully autonomous pixels containing event-based change detection and pulse-width-modulation (PWM) imaging circuitry. Exposure measurements are initiated and carried out locally by the individual pixel that has detected a change of brightness in its field-of-view. Pixels do not rely on external timing signals and independently and asynchronously request access to an (asynchronously arbitrated) output channel when they have new grayscale values to communicate. Pixels that are not stimulated visually do not produce output. The visual information acquired from the scene (temporal contrast and grayscale data) is communicated in the form of asynchronous address-events (AER), with the grayscale values encoded in inter-event intervals. The pixel-autonomous and massively parallel operation ideally results in lossless video compression through complete temporal redundancy suppression at the pixel level. Compression factors depend on scene activity and peak at ~1000 for static scenes. Due to the time-based encoding of the illumination information, very high dynamic range - intra-scene DR of 143 dB static and 125 dB at 30 fps equivalent temporal resolution - is achieved. A novel time-domain correlated double sampling (TCDS) method yields array FPN of <0.25% rms. SNR is >56 dB (9.3 bit) for >10 lx illuminance.
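The inter-event-interval encoding can be illustrated with a minimal decoding sketch; the event tuple format and the scale constant K below are assumptions for illustration, not the sensor's actual output interface.

```python
# Hypothetical sketch: reconstructing grayscale values from inter-event
# intervals in an ATIS-style event stream. The event format and the scale
# constant K are assumptions for illustration only.
K = 1.0e-3  # assumed proportionality constant (intensity ~ K / interval)

# (pixel_address, timestamp_s, event_type): each exposure measurement is
# bracketed by a 'start' and an 'end' event from the same pixel.
events = [
    (42, 0.000100, "start"), (42, 0.000350, "end"),   # bright pixel: short interval
    (77, 0.000120, "start"), (77, 0.004120, "end"),   # dark pixel: long interval
]

start_times = {}
gray = {}
for addr, t, kind in events:
    if kind == "start":
        start_times[addr] = t
    else:
        interval = t - start_times.pop(addr)
        gray[addr] = K / interval  # grayscale encoded in the inter-event interval

print(gray)  # the shorter interval yields the higher reconstructed intensity
```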
international symposium on circuits and systems | 2008
Christoph Posch; Daniel Matolin; Rainer Wohlgenannt
In this paper we propose a fully asynchronous, time-based image sensor, which is characterized by high temporal resolution, low data rate (near-complete temporal redundancy suppression), high dynamic range, and low power consumption. Autonomous pixels asynchronously communicate the detection of relative changes in light intensity, and the time from change detection to the threshold crossing of a photocurrent integrator, thereby encoding the instantaneous pixel illumination shortly after the time of a detected change. The chip is being implemented in a standard 0.18 µm CMOS process and measures less than 10×8 mm² at 304×240 pixel resolution.
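A minimal first-order model of this time-based exposure measurement, assuming a simple linear integration of the photocurrent onto a capacitance (component values are placeholders, not the chip's design parameters):

```python
# First-order model of the time-based exposure measurement: the photocurrent
# charges an integration capacitance until a voltage threshold is crossed, so
# the crossing time is inversely proportional to illumination. All values are
# placeholders, not the chip's design parameters.
C_INT = 50e-15      # integration capacitance [F] (assumed)
V_THRESHOLD = 0.5   # integrator threshold [V] (assumed)

def time_to_threshold(photocurrent_a: float) -> float:
    """Time from change detection to threshold crossing: t = C * V / I."""
    return C_INT * V_THRESHOLD / photocurrent_a

for i_photo in (1e-12, 10e-12, 100e-12):  # dark to bright pixel
    print(f"I = {i_photo:.0e} A  ->  t = {time_to_threshold(i_photo) * 1e3:.3f} ms")
```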
international solid-state circuits conference | 2010
Christoph Posch; Daniel Matolin; Rainer Wohlgenannt
Conventional image/video sensors acquire visual information from a scene in time-quantized fashion at some predetermined frame rate. Each frame carries the information from all pixels, regardless of whether or not this information has changed since the previous frame was acquired, usually only a short time earlier. Depending on the dynamic contents of the scene, this approach leads to a considerable degree of redundancy in the image data. Acquisition and handling of these dispensable data consume valuable resources; sophisticated and resource-hungry video compression methods have been developed to deal with them.
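A toy sketch of the redundancy argument, assuming a QVGA-sized array in which only a small fraction of pixels changes between two frames (the resolution is reused from the papers above; the scene statistics are invented for illustration):

```python
# Toy illustration of temporal redundancy in frame-based acquisition:
# between two frames of a mostly static scene, only a few pixels change,
# yet a conventional sensor transmits all of them. Values are arbitrary.
import random

W, H = 304, 240                      # QVGA-like resolution
frame0 = [[128] * W for _ in range(H)]
frame1 = [row[:] for row in frame0]

# Simulate a small moving object touching roughly 0.1% of the pixels.
for _ in range(int(0.001 * W * H)):
    y, x = random.randrange(H), random.randrange(W)
    frame1[y][x] = 200

changed = sum(frame1[y][x] != frame0[y][x] for y in range(H) for x in range(W))
print(f"pixels transmitted per frame: {W * H}")
print(f"pixels that actually changed: {changed}")
print(f"redundancy factor: {(W * H) / max(changed, 1):.0f}x")
```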
IEEE Transactions on Biomedical Circuits and Systems | 2011
Denis Guangyin Chen; Daniel Matolin; Amine Bermak; Christoph Posch
In time-domain or pulse-modulation (PM) imaging, the incident light intensity is not encoded in amounts of charge, voltage, or current as it is in conventional image sensors. Instead, the image data are represented by the timing of pulses or pulse edges. This method of visual information encoding optimizes the phototransduction individually for each pixel by abstaining from imposing a fixed integration time on the entire array. Exceptionally high dynamic range (DR) and improved signal-to-noise ratio (SNR) are immediate benefits of this approach. In particular, DR is no longer limited by the power-supply rails as in conventional complementary metal-oxide semiconductor (CMOS) active pixel sensors, thus providing relative immunity to the supply-voltage scaling of modern CMOS technologies. In addition, PM imaging naturally supports pixel-parallel analog-to-digital conversion, thereby enabling high temporal resolution/frame rates or an asynchronous event-based array readout. The applications of PM imaging in emerging areas such as sensor networks, wireless endoscopy, retinal prostheses, polarization imaging, and energy harvesting are surveyed to demonstrate the effectiveness of PM imaging in low-power, high-performance machine vision and biomedical applications of the future. The evolving design innovations made in PM imaging, such as high-speed arbitration circuits and ultra-compact processing elements, are expected to have even wider impacts in disciplines beyond CMOS image sensors. This paper thoroughly reviews and classifies all common PM image sensor architectures. Analytical models and a universal figure of merit - an image quality and dynamic range to energy complexity factor - are proposed to quantitatively assess different PM imagers across the entire spectrum of PM architectures.
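The supply-rail argument can be made concrete with a one-line relation: in PM imaging the intra-scene dynamic range follows the ratio of the longest to the shortest measurable integration time. A minimal sketch, with the timing limits assumed for illustration:

```python
import math

# Dynamic range of a time-domain (PM) pixel is governed by the ratio of the
# longest to the shortest measurable integration time, not by the voltage
# supply. The timing limits below are assumed for illustration.
t_min = 1e-6   # shortest resolvable integration time [s] (assumed)
t_max = 10.0   # longest allowed integration time [s] (assumed)

dr_db = 20 * math.log10(t_max / t_min)
print(f"achievable intra-scene dynamic range: {dr_db:.0f} dB")  # 140 dB
```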
international solid-state circuits conference | 2007
Christoph Posch; Michael Hofstätter; Daniel Matolin; Guy Vanstraelen; Peter Schön; Nikolaus Donath; Martin Litzenberger
A 120 dB dynamic range 2×256 dual-line optical transient sensor uses pixels that respond asynchronously to relative intensity changes. A time stamp with variable resolution down to 100 ns is allocated to the events at the pixel level. The pixel address and time stamp are read out via a 3-stage pipelined synchronous arbiter. The chip is fabricated in 0.35 µm CMOS, runs at 40 MHz, and consumes 250 mW at 3.3 V.
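A small data-structure sketch of the pixel-address-plus-time-stamp output described above; the 100 ns tick is taken from the abstract, while the field names and example values are assumptions:

```python
from dataclasses import dataclass

# Sketch of a time-stamped address-event as produced by a dual-line transient
# sensor: pixel address plus a time stamp with 100 ns resolution. Field names
# and the example values are assumptions for illustration.
TICK_NS = 100  # time-stamp resolution from the abstract

@dataclass(frozen=True)
class TimestampedEvent:
    pixel_address: int    # which of the 2x256 pixels fired
    timestamp_ticks: int  # event time in 100 ns ticks

    @property
    def time_ns(self) -> int:
        return self.timestamp_ticks * TICK_NS

ev = TimestampedEvent(pixel_address=137, timestamp_ticks=42)
print(ev.pixel_address, ev.time_ns)  # 137, 4200 ns
```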
international symposium on circuits and systems | 2009
Daniel Matolin; Christoph Posch; Rainer Wohlgenannt
This paper presents a time-domain correlated double sampling (CDS) method for time-based/PWM image sensors. The concept has been realized in the pixel circuit design for a spiking asynchronous, time-based image sensor (ATIS). The pixel circuitry includes a two-stage voltage comparator with tunable hysteresis and dynamic current control, and pixel-level state logic. The sensor, based on a 240×304 pixel array, was implemented in a standard 0.18µm CMOS process. We present measurements from the fabricated chip and compare them to results from theoretical considerations. Implications of the proposed CDS method on the comparator design in terms of chip area and power consumption are discussed and quantified.
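A minimal sketch of the time-domain CDS idea, assuming a linearly falling integration ramp and two reference levels: a constant offset in the starting (reset) level shifts both crossing times equally, so it cancels in their difference. All component values are illustrative, not the ATIS design values.

```python
# Sketch of time-domain correlated double sampling (TCDS): the integration
# ramp crosses two reference levels, and the illumination estimate is formed
# from the difference of the two crossing times, so a constant offset in the
# starting level (reset noise, comparator offset) cancels. Values are
# illustrative, not the design values of the ATIS pixel.
C_INT = 50e-15          # integration capacitance [F] (assumed)
V_REF_HIGH = 1.0        # first reference level [V] (assumed)
V_REF_LOW = 0.5         # second reference level [V] (assumed)

def crossing_time(v_start: float, v_ref: float, i_photo: float) -> float:
    """Time for a linear ramp starting at v_start to fall to v_ref."""
    return C_INT * (v_start - v_ref) / i_photo

i_photo = 20e-12        # photocurrent [A]
v_reset = 1.5 + 0.02    # reset level including a 20 mV offset/noise term

t_high = crossing_time(v_reset, V_REF_HIGH, i_photo)
t_low = crossing_time(v_reset, V_REF_LOW, i_photo)

# The offset appears in both crossing times and cancels in the difference.
t_cds = t_low - t_high
intensity = C_INT * (V_REF_HIGH - V_REF_LOW) / t_cds  # recovers i_photo
print(f"estimated photocurrent: {intensity:.2e} A")
```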
international symposium on circuits and systems | 2010
Christoph Posch; Daniel Matolin; Rainer Wohlgenannt
The presented asynchronous, time-based CMOS dynamic vision and image sensor is based on a QVGA (304×240) array of fully autonomous pixels containing event-based change detection and PWM imaging circuitry. Exposure measurements are initiated and carried out locally by the individual pixel that has detected a brightness change in its field-of-view. Thus pixels do not rely on external timing signals and independently and asynchronously request access to an (asynchronously arbitrated) output channel when they have new illumination values to communicate. Communication is address-event based (AER); gray-levels are encoded in inter-event intervals. Pixels that are not stimulated visually do not produce output. This pixel-autonomous and massively parallel operation ideally results in optimal lossless video compression through complete temporal redundancy suppression at the focal plane. Compression factors depend on scene activity. Due to the time-based encoding of the illumination information, very high dynamic range - intra-scene DR of 143 dB static and 125 dB at 30 fps equivalent temporal resolution - is achieved. A novel time-domain correlated double sampling (TCDS) method yields array FPN of <0.25%. SNR is >56 dB (9.3 bit) for >10 lx.
IEEE Sensors Journal | 2009
Christoph Posch; Daniel Matolin; Rainer Wohlgenannt; Thomas Maier; Martin Litzenberger
In this paper, a novel event-based dynamic IR vision sensor is presented. The device combines an uncooled microbolometer array with biology-inspired ("neuromorphic") readout circuitry to implement an asynchronous, "spiking" vision sensor for the 8-15 µm thermal infrared spectral range. The sensor's autonomous pixels independently respond to changes in thermal IR radiation and communicate detected variations in the form of asynchronous "address-events." The 64×64 pixel ROIC chip has been fabricated in a 0.35 µm 2P4M standard CMOS process, covers about 4×4 mm² of silicon area, and consumes 8 mW of power. An amorphous silicon (a-Si) microbolometer array has been processed on top of the ROIC and contacted to the pixel circuits. We discuss the bolometer detector properties, describe the pixel circuits and the implemented sensor architecture, and show measurement results of the readout circuits. Subsequently, a DFT-based approach to the characterization of asynchronous, spiking sensor arrays is discussed and applied. Test results and analysis of sensitivity, bandwidth, and noise of the fabricated IR sensor prototype are presented.
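The DFT-based characterization can be approximated in a few lines: the asynchronous event train of one pixel is binned onto a uniform time grid and the spectral line at the stimulus frequency is evaluated. The stimulus, record length, and binning below are assumptions, not the paper's actual test setup.

```python
import cmath
import math

# Sketch of a DFT-based characterization of an asynchronous spiking sensor:
# the event train of one pixel is binned onto a uniform time grid and the
# spectral line at the stimulus frequency is evaluated. Stimulus frequency,
# record length, and binning are assumptions for illustration.
F_STIM = 10.0       # modulation frequency of the test stimulus [Hz]
T_RECORD = 1.0      # record length [s]
N_BINS = 1024       # uniform time grid

# Hypothetical event timestamps (seconds) from one pixel under a 10 Hz stimulus.
event_times = [k / (2 * F_STIM) + 0.005 for k in range(int(2 * F_STIM * T_RECORD))]

signal = [0.0] * N_BINS
for t in event_times:
    signal[min(int(t / T_RECORD * N_BINS), N_BINS - 1)] += 1.0

def dft_bin(x, k):
    """Single DFT coefficient at bin k."""
    n = len(x)
    return sum(x[i] * cmath.exp(-2j * math.pi * k * i / n) for i in range(n))

k_stim = round(F_STIM * T_RECORD)      # DFT bin of the stimulus frequency
print(abs(dft_bin(signal, k_stim)))    # response magnitude at 10 Hz
```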
international symposium on circuits and systems | 2007
Christoph Posch; Michael Hofstätter; Martin Litzenberger; Daniel Matolin; Nikolaus Donath; Peter Schön; Heinrich Garn
This paper presents a 2×256 pixel dual-line temporal contrast vision sensor and the use of this sensor in exemplary high-speed machine vision applications over a wide range of target illumination. The sensor combines an asynchronous, data-driven pixel circuit with an on-chip precision time-stamp generator and a 3-stage pipelined synchronous bus arbiter. With a temporal resolution of down to 100 ns, corresponding to a line rate of 10 MHz, the sensor is ideal for high-speed machine vision tasks that do not rely on conventional image data. The output data rate depends on the dynamic contents of the target scene and is typically orders of magnitude lower than the equivalent data output produced by conventional clocked line sensors in this type of application. A 120 dB dynamic range makes high-speed operation possible at low lighting levels or under uncontrolled lighting conditions. The sensor features two parallel pixel lines with a line separation of 250 µm and a pixel pitch of 15 µm. A prototype was fabricated in a standard 0.35 µm CMOS technology. Results on high-speed edge angle resolution and edge gradient extraction as well as wide dynamic range operation are presented.
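The asynchronous, data-driven pixel behavior follows the usual temporal-contrast principle: an event is emitted whenever the log-intensity has changed by more than a threshold since the pixel's previous event. A minimal sketch, with the threshold and input samples assumed for illustration:

```python
import math

# Sketch of temporal-contrast change detection: a pixel emits an ON/OFF event
# whenever log-intensity changes by more than a threshold relative to the
# level at its previous event. Threshold and input samples are illustrative.
CONTRAST_THRESHOLD = 0.15  # ~15% relative intensity change (assumed)

def contrast_events(intensity_samples):
    events = []
    ref = math.log(intensity_samples[0])
    for i, value in enumerate(intensity_samples[1:], start=1):
        delta = math.log(value) - ref
        if abs(delta) > CONTRAST_THRESHOLD:
            events.append((i, "ON" if delta > 0 else "OFF"))
            ref = math.log(value)  # re-reference the pixel at the event
    return events

print(contrast_events([100, 101, 120, 150, 148, 90]))  # [(2,'ON'),(3,'ON'),(5,'OFF')]
```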
international symposium on circuits and systems | 2010
Daniel Matolin; Rainer Wohlgenannt; Martin Litzenberger; Christoph Posch
This paper describes the concept and implementation of an asynchronous, column-parallel readout method for event-based pulse-width-modulation (PWM) image sensors. These time-based imaging devices transmit exposure information in the form of asynchronous spike-events (AER) via an arbitrated asynchronous data bus that is common to all pixels. Event collisions on the bus distort the time information and lead to errors in the instantaneous illumination measurement of the affected pixel. The effect becomes manifest when imaging uniform, homogeneous scenes or parts of scenes. One method to balance the load on the communication channel is to avoid a global reset signal. The proposed concept spreads the reset times according to the varying local illumination in the array after a global starting point, effectively realizing an asynchronous, column-parallel, light-dependent "rolling-shutter" mode. This novel concept has been realized in the design of a spiking asynchronous, time-based QVGA image sensor. We present theoretical considerations and preliminary measurement results from the chip fabricated in a standard 0.18 µm CMOS process.
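A minimal sketch of a light-dependent reset schedule of the kind described above, assuming that each column is reset once its previous exposure (t = C·V/I) has completed rather than at a global instant; the constants and the delay rule are illustrative, not the implemented circuit:

```python
# Sketch of a light-dependent "rolling-shutter" reset schedule: rather than
# resetting all pixels at a global instant, each column is reset only after
# its previous exposure measurement has completed, which depends on local
# illumination (bright columns finish earlier, dark ones later). This spreads
# exposure-end events over the shared AER bus. Constants and the delay rule
# are assumptions for illustration.
C_INT = 50e-15       # integration capacitance [F] (assumed)
V_SWING = 0.5        # integration voltage swing [V] (assumed)

def column_reset_times(column_photocurrents, t_start=0.0):
    """Reset each column when its exposure (t = C * V / I) has completed."""
    return [t_start + C_INT * V_SWING / i for i in column_photocurrents]

# Columns seeing different local illumination finish (and are reset) at
# different times, so their events no longer collide on the shared bus.
print(column_reset_times([5e-12, 10e-12, 20e-12, 40e-12]))
```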