Network


Latest external collaborations at the country level. Click on the dots to explore the details.

Hotspot


Dive into the research topics where Nikolaus Donath is active.

Publication


Featured research published by Nikolaus Donath.


International Solid-State Circuits Conference | 2007

A Dual-Line Optical Transient Sensor with On-Chip Precision Time-Stamp Generation

Christoph Posch; Michael Hofstätter; Daniel Matolin; Guy Vanstraelen; Peter Schön; Nikolaus Donath; Martin Litzenberger

A 120 dB dynamic range 2×256 dual-line optical transient sensor uses pixels that respond asynchronously to relative intensity changes. A time stamp with variable resolution down to 100 ns is allocated to the events at the pixel level. The pixel address and time stamp are read out via a 3-stage pipelined synchronous arbiter. The chip is fabricated in 0.35 µm CMOS, runs at 40 MHz and consumes 250 mW at 3.3 V.
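The event format described above pairs a pixel address with an on-chip time stamp. The following sketch models such an address-event with 100 ns time-stamp resolution; all names and the field layout are hypothetical simplifications, not the chip's actual data format.

```python
from dataclasses import dataclass

# Finest time-stamp resolution stated in the abstract: 100 ns per tick.
TICK_NS = 100

@dataclass(frozen=True)
class AddressEvent:
    """Hypothetical model of one asynchronous pixel event."""
    pixel: int     # pixel address along the 2x256 dual line
    polarity: int  # +1 for a relative intensity increase, -1 for a decrease
    tick: int      # time stamp in units of TICK_NS

    def time_ns(self) -> int:
        """Absolute event time in nanoseconds."""
        return self.tick * TICK_NS

ev = AddressEvent(pixel=42, polarity=+1, tick=7)
print(ev.time_ns())  # 700
```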


EURASIP Journal on Embedded Systems | 2007

Embedded vehicle speed estimation system using an asynchronous temporal contrast vision sensor

Daniel Bauer; Ahmed Nabil Belbachir; Nikolaus Donath; Gerhard Gritsch; Bernhard Kohn; Martin Litzenberger; Christoph Posch; Peter Schön; Stephan Schraml

This article presents an embedded multilane traffic data acquisition system based on an asynchronous temporal contrast vision sensor, and algorithms for vehicle speed estimation developed to make efficient use of the asynchronous high-precision timing information delivered by this sensor. The vision sensor features high temporal resolution with a latency of less than 100 µs, a wide dynamic range of 120 dB of illumination, and zero-redundancy, asynchronous data output. For data collection, processing and interfacing, a low-cost digital signal processor is used. The speed of the detected vehicles is calculated from the vision sensor's asynchronous temporal contrast event data. We present three different algorithms for velocity estimation and evaluate their accuracy by means of calibrated reference measurements. The error of the speed estimation of all algorithms has near-zero mean and a standard deviation better than 3% for both traffic flow directions. The results and the accuracy limitations, as well as the combined use of the algorithms in the system, are discussed.
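One simple way to turn the sensor's high-precision event time stamps into a speed estimate is a time-of-flight calculation between two detection lines with a known on-road separation. This is only a minimal sketch of that idea; the function name, the microsecond time base, and the 4 m line spacing in the example are illustrative assumptions, and the paper's three actual algorithms are not reproduced here.

```python
def estimate_speed_kmh(t_first_us: float, t_second_us: float,
                       line_distance_m: float) -> float:
    """Speed from the transit time between two detection lines.

    t_first_us / t_second_us: event time stamps (microseconds) at which
    a vehicle edge crosses the first and the second detection line.
    line_distance_m: known on-road distance between the two lines.
    """
    dt_s = (t_second_us - t_first_us) * 1e-6
    if dt_s <= 0:
        raise ValueError("second crossing must come after the first")
    return line_distance_m / dt_s * 3.6  # m/s -> km/h

# A vehicle covering 4 m in 144 ms travels at 100 km/h.
print(estimate_speed_kmh(0.0, 144_000.0, 4.0))  # 100.0
```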


International Conference on Intelligent Transportation Systems | 2007

Vehicle Counting with an Embedded Traffic Data System using an Optical Transient Sensor

Martin Litzenberger; Bernhard Kohn; Gerhard Gritsch; Nikolaus Donath; C. Posch; N.A. Belbachir; H. Garn

In this paper a sensor system for traffic data acquisition is presented. The embedded system, comprising a motion-sensitive optical sensor and a low-cost, low-power DSP, is capable of detecting, counting and measuring the velocity of passing vehicles. The detection is based on monitoring the optical sensor output within configurable regions of interest in the sensor's field of view. In particular, in this work we focus on the evaluation of the applied vehicle counting algorithm. The verification of the acquired data is based on manually annotated traffic data of 360 minutes length, containing a total of about 7000 vehicles. The counting error is determined for short (3 minutes) and long (60 minutes) time intervals. The counting error remains within the commonly recognized margins of 10% and 3% of detection error for 99.2% of the short and 100% of the long time intervals analyzed, respectively.
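The counting principle described above, monitoring sensor activity inside a region of interest, can be sketched as a simple rising-edge counter with hysteresis. The event-rate thresholds and the hysteresis scheme here are illustrative assumptions, not values or details from the paper.

```python
def count_vehicles(activity: list, on_threshold: int, off_threshold: int) -> int:
    """Count vehicles from per-interval event counts inside one ROI.

    A vehicle is counted on each rising edge: activity climbing above
    on_threshold while no vehicle is currently present.  The lower
    off_threshold adds hysteresis so one vehicle is not counted twice.
    """
    count = 0
    present = False
    for a in activity:
        if not present and a >= on_threshold:
            present = True
            count += 1
        elif present and a <= off_threshold:
            present = False
    return count

# Two bursts of ROI activity -> two vehicles counted.
print(count_vehicles([0, 2, 30, 40, 3, 0, 25, 35, 1],
                     on_threshold=20, off_threshold=5))  # 2
```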


International Conference on Intelligent Transportation Systems | 2009

Night-time vehicle classification with an embedded vision system

Gerhard Gritsch; Nikolaus Donath; Bernhard Kohn; Martin Litzenberger

The paper presents night-time vehicle classification using an embedded vision system based on an optical transient sensor. This neuromorphic sensor features an array of 128×128 pixels that respond to relative light intensity changes with low latency and high dynamic range. The proposed algorithm exploits the temporal resolution and sparse representation of the data, delivered by the sensor in the data-driven Address-Event Representation (AER) format, to efficiently implement a robust classification of vehicles into two classes, car-like and truck-like, during night-time operation. The classification is based on the extraction of the positions and distances of the vehicles' head lights to estimate vehicle width. We present the algorithm, test data and an evaluation of the classification accuracy by comparison of the test data with ground truth from video annotation and reference results from a state-of-the-art ultrasonic/radar-combination reference detector. The results show that the difference in total truck counts with respect to a reference detector and to manually annotated video during night-time operation under dry and wet road conditions is typically below 6%.
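The width-from-headlight-distance idea can be sketched as: take the two detected headlight positions, convert their pixel separation to a metric width using the known camera geometry, and threshold on that width. The pixel-to-metre scale and the 2.2 m class boundary below are illustrative assumptions, not parameters from the paper.

```python
def classify_vehicle(headlight_px: tuple,
                     metres_per_pixel: float,
                     truck_width_m: float = 2.2) -> str:
    """Classify a vehicle as 'car' or 'truck' from headlight spacing.

    headlight_px: x-positions (pixels) of the left and right head light.
    metres_per_pixel: scale at the vehicle's distance, assumed known
    from the mounting geometry.
    """
    width_m = abs(headlight_px[1] - headlight_px[0]) * metres_per_pixel
    return "truck" if width_m >= truck_width_m else "car"

print(classify_vehicle((30.0, 80.0), metres_per_pixel=0.03))   # 1.5 m wide -> car
print(classify_vehicle((20.0, 100.0), metres_per_pixel=0.03))  # 2.4 m wide -> truck
```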


International Symposium on Circuits and Systems | 2007

Wide dynamic range, high-speed machine vision with a 2×256 pixel temporal contrast vision sensor

Christoph Posch; Michael Hofstätter; Martin Litzenberger; Daniel Matolin; Nikolaus Donath; Peter Schön; Heinrich Garn

This paper presents a 2×256 pixel dual-line temporal contrast vision sensor and the use of this sensor in exemplary high-speed machine vision applications over a wide range of target illumination. The sensor combines an asynchronous, data-driven pixel circuit with an on-chip precision time-stamp generator and a 3-stage pipelined synchronous bus arbiter. With a temporal resolution down to 100 ns, corresponding to a line rate of 10 MHz, the sensor is ideal for high-speed machine vision tasks that do not rely on conventional image data. The output data rate depends on the dynamic contents of the target scene and is typically orders of magnitude lower than the equivalent data output produced by conventional clocked line sensors in this type of application. A 120 dB dynamic range makes high-speed operation possible at low lighting levels or under uncontrolled lighting conditions. The sensor features two parallel pixel lines with a line separation of 250 µm and a pixel pitch of 15 µm. A prototype was fabricated in a standard 0.35 µm CMOS technology. Results on high-speed edge angle resolution and edge gradient extraction as well as wide dynamic range operation are presented.


Elektrotechnik und Informationstechnik | 2007

Ein innovatives, optisches Sensorsystem für die Verkehrsdatenerfassung (An innovative optical sensor system for traffic data acquisition)

Martin Litzenberger; Bernhard Kohn; Gerhard Gritsch; Nikolaus Donath; Heinrich Garn

The article describes an embedded optical sensor system based on a specialized CMOS vision sensor with on-chip analogue signal pre-processing. The sensor data are transmitted asynchronously to the downstream digital signal processing via the so-called address-event protocol. Thanks to the efficient data coding and signal pre-processing, which extracts moving objects from the scene, the digital data processing on a low-cost digital signal processor achieves a temporal resolution of one millisecond. The traffic data acquisition algorithms compute traffic flow, vehicle velocity, net time gap and lane occupancy in real time on up to four lanes simultaneously. The article presents traffic data acquired continuously over four days at a four-lane highway test site. The time courses of traffic flow, mean vehicle speed, mean net time gap and mean lane occupancy were evaluated in five-minute intervals. The error of the velocity estimation is below 3%; the accuracy of the vehicle counting is better than 97% in daylight under normal conditions.
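Aggregating per-vehicle measurements into five-minute intervals, as in the evaluation above, can be sketched as follows. The record format (timestamp, speed) and the returned statistics are a hypothetical simplification; the article's system also derives net time gap and lane occupancy, which are omitted here.

```python
from collections import defaultdict

def five_minute_stats(records: list) -> dict:
    """Aggregate (timestamp_s, speed_kmh) vehicle records into 5-min bins.

    Returns {bin_index: (flow_vehicles, mean_speed_kmh)}, where bin_index
    counts 300-second intervals from t = 0.
    """
    bins = defaultdict(list)
    for t, v in records:
        bins[int(t // 300)].append(v)
    return {b: (len(vs), sum(vs) / len(vs)) for b, vs in bins.items()}

stats = five_minute_stats([(10.0, 95.0), (200.0, 105.0), (400.0, 80.0)])
print(stats)  # {0: (2, 100.0), 1: (1, 80.0)}
```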


International Conference on Intelligent Transportation Systems | 2006

Estimation of Vehicle Speed Based on Asynchronous Data from a Silicon Retina Optical Sensor

Martin Litzenberger; A.N. Belbachir; Nikolaus Donath; Gerhard Gritsch; H. Garn; Bernhard Kohn; C. Posch; S. Schraml


Archive | 2006

Method and Image Evaluation Unit for Scene Analysis

Martin Litzenberger; Bernhard Kohn; Peter Schön; Michael Hofstätter; Nikolaus Donath; Christoph Posch; Nenad Milosevic


International Conference on Intelligent Transportation Systems | 2008

Real-Time Vehicle Classification using a Smart Embedded Device with a 'Silicon Retina' Optical Sensor

Gerhard Gritsch; Martin Litzenberger; Nikolaus Donath; Bernhard Kohn


Archive | 2006

Method for evaluating scenes and sensor or recording element

Daniel Bauer; Nikolaus Donath; Michael Hofstätter; Bernhard Kohn; Martin Litzenberger; Josef Meser; Nenad Milosevic; Christoph Posch; Peter Schön

Collaboration


Dive into Nikolaus Donath's collaborations.

Top Co-Authors

Martin Litzenberger

Austrian Institute of Technology

Bernhard Kohn

Austrian Institute of Technology

Christoph Posch

Austrian Institute of Technology

Peter Schön

Austrian Institute of Technology

Gerhard Gritsch

Austrian Institute of Technology

Michael Hofstätter

Austrian Institute of Technology

Nenad Milosevic

Austrian Institute of Technology

Daniel Bauer

Austrian Institute of Technology

Daniel Matolin

Austrian Institute of Technology

Heinrich Garn

Austrian Institute of Technology
