
Publication


Featured research published by Thiago Santini.


arXiv: Computer Vision and Pattern Recognition | 2016

ElSe: ellipse selection for robust pupil detection in real-world environments

Thiago Santini; Thomas C. Kübler; Enkelejda Kasneci

Fast and robust pupil detection is an essential prerequisite for video-based eye-tracking in real-world settings. Several algorithms for image-based pupil detection have been proposed in the past; their applicability, however, is mostly limited to laboratory conditions. In real-world scenarios, automated pupil detection has to face various challenges, such as illumination changes, reflections (on glasses), make-up, non-centered eye recording, and physiological eye characteristics. We propose ElSe, a novel algorithm based on ellipse evaluation of a filtered edge image. We aim at a robust, inexpensive approach that can be integrated into embedded architectures, e.g., for driving scenarios. The proposed algorithm was evaluated against four state-of-the-art methods on over 93,000 hand-labeled images, of which 55,000 are new eye images contributed by this work. On average, the proposed method achieved a 14.53% improvement in the detection rate relative to the best state-of-the-art performer. Algorithm and data sets are available for download: ftp://[email protected] (password: eyedata).
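The core idea of rating ellipse candidates on an edge image can be pictured with a deliberately simplified sketch: a dark-pupil candidate should enclose darker pixels than its immediate surroundings. The function names and the contrast criterion below are illustrative assumptions, not the paper's actual evaluation scheme.

```python
def inside_ellipse(x, y, cx, cy, a, b):
    """True if pixel (x, y) lies inside the axis-aligned ellipse
    centered at (cx, cy) with semi-axes a and b."""
    return ((x - cx) / a) ** 2 + ((y - cy) / b) ** 2 <= 1.0

def pupil_score(img, cx, cy, a, b, ring=1.5):
    """Toy candidate rating: mean brightness of a surrounding ring minus
    the mean brightness inside the candidate (dark pupils score high).
    `img` is a list of rows of gray values."""
    inner, outer = [], []
    for y, row in enumerate(img):
        for x, v in enumerate(row):
            if inside_ellipse(x, y, cx, cy, a, b):
                inner.append(v)
            elif inside_ellipse(x, y, cx, cy, a * ring, b * ring):
                outer.append(v)
    if not inner or not outer:
        return float("-inf")
    return sum(outer) / len(outer) - sum(inner) / len(inner)
```

Among several edge-derived candidates, the highest-scoring one would be selected; the published algorithm uses considerably more elaborate edge filtering, validation, and fallback steps than this sketch.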


arXiv: Computer Vision and Pattern Recognition | 2016

Bayesian identification of fixations, saccades, and smooth pursuits

Thiago Santini; Thomas C. Kübler; Enkelejda Kasneci

Smooth pursuit eye movements provide meaningful insights and information on subjects' behavior and health and may, in particular situations, disturb the performance of typical fixation/saccade classification algorithms. Thus, an automatic and efficient algorithm to identify these eye movements is paramount for eye-tracking research involving dynamic stimuli. In this paper, we propose the Bayesian Decision Theory Identification (I-BDT) algorithm, a novel algorithm for ternary classification of eye movements that is able to reliably separate fixations, saccades, and smooth pursuits in an online fashion, even for low-resolution eye trackers. The proposed algorithm is evaluated on four datasets with distinct mixtures of eye movements, including fixations, saccades, as well as straight and circular smooth pursuits; data was collected at a sample rate of 30 Hz from six subjects, totaling 24 evaluation datasets. The algorithm exhibits high and consistent performance across all datasets and movements relative to a manual annotation by a domain expert (recall: μ = 91.42%, σ = 9.52%; precision: μ = 95.60%, σ = 5.29%; specificity: μ = 95.41%, σ = 7.02%) and displays a significant improvement when compared to I-VDT, a state-of-the-art algorithm (recall: μ = 87.67%, σ = 14.73%; precision: μ = 89.57%, σ = 8.05%; specificity: μ = 92.10%, σ = 11.21%). Algorithm implementation and annotated datasets are openly available at www.ti.uni-tuebingen.de/perception.
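The Bayesian decision step can be sketched as follows: per-class likelihoods of the observed eye speed are combined with class priors, and the maximum a posteriori class wins. The speed thresholds and likelihood values here are illustrative placeholders, not the paper's model (which, among other things, updates its priors online and uses additional features).

```python
def classify_sample(speed, prior, v_fix=5.0, v_sacc=60.0):
    """Toy ternary MAP classifier in the spirit of I-BDT.
    speed: eye speed in deg/s; prior: dict of class priors.
    v_fix / v_sacc are illustrative thresholds, not the paper's parameters."""
    likelihood = {
        "fixation": 1.0 if speed < v_fix else 0.05,
        "saccade":  1.0 if speed > v_sacc else 0.05,
        "pursuit":  1.0 if v_fix <= speed <= v_sacc else 0.05,
    }
    # Posterior is proportional to likelihood times prior (Bayes' rule).
    post = {c: likelihood[c] * prior[c] for c in prior}
    z = sum(post.values())
    label = max(post, key=post.get)
    return label, {c: p / z for c, p in post.items()}
```

In an online setting, the normalized posteriors returned here would feed back into the priors for the next sample, which is what lets such a classifier adapt to the current mixture of movements.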


IEEE Transactions on Computers | 2016

Evaluation and Mitigation of Radiation-Induced Soft Errors in Graphics Processing Units

Daniel Oliveira; Laércio Lima Pilla; Thiago Santini; Paolo Rech

Graphics processing units (GPUs) are increasingly attractive for both safety-critical and high-performance computing applications. GPU reliability is a primary concern for both the automotive and aerospace markets and is becoming an issue also for supercomputers. In fact, the high number of devices in large data centers makes the probability of having at least one corrupted device very high. In this paper, we aim at giving novel insights on GPU reliability by evaluating the neutron sensitivity of modern GPUs' memory structures, highlighting pattern dependence and multiple-error occurrences. Additionally, a wide set of parallel codes is exposed to controlled neutron beams to measure GPUs' operative error rates. From experimental data and algorithm analysis, we derive general insights on the reliability of parallel algorithms and programming approaches. Finally, error-correcting codes, algorithm-based fault tolerance, and duplication-with-comparison hardening strategies are presented and evaluated on GPUs through radiation experiments. We present and compare both the reliability improvement and the overhead imposed by the selected hardening solutions.


human factors in computing systems | 2017

CalibMe: Fast and Unsupervised Eye Tracker Calibration for Gaze-Based Pervasive Human-Computer Interaction

Thiago Santini; Enkelejda Kasneci

As devices around us become smart, our gaze is poised to become the next frontier of human-computer interaction (HCI). State-of-the-art mobile eye-tracker systems typically rely on eye-model-based gaze estimation approaches, which do not require a calibration. However, such approaches require specialized hardware (e.g., multiple cameras and glint points), can be significantly affected by glasses, and are thus not fit for ubiquitous gaze-based HCI. In contrast, regression-based gaze estimations are straightforward approaches requiring solely one eye camera and one scene camera but necessitate a calibration. Therefore, a fast and accurate calibration is a key development to enable ubiquitous gaze-based HCI. In this paper, we introduce CalibMe, a novel method that exploits collection markers (automatically detected fiducial markers) to allow eye-tracker users to gather a large array of calibration points, remove outliers, and automatically reserve evaluation points in a fast and unsupervised manner. The proposed approach is evaluated against a nine-point calibration method, which is typically used due to its relatively short calibration time and adequate accuracy. CalibMe reached a mean angular error of 0.59° (σ = 0.23°), in contrast to 0.82° (σ = 0.15°) for a nine-point calibration, attesting to the efficacy of the method. Moreover, users are able to calibrate the eye tracker anywhere and independently in ~10 s using a cellphone to display the collection marker.
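The regression-based mapping that such a calibration produces can be sketched as a least-squares fit from pupil coordinates to scene-camera coordinates. A simple affine model and a hand-rolled solver are used here purely for illustration; the actual method's regression model, outlier removal, and evaluation-point handling are more involved.

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting.
    A: n x n list of lists, b: length-n list."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_gaze_mapping(pupil_pts, scene_pts):
    """Least-squares affine map (px, py) -> (gx, gy) from calibration pairs,
    via the normal equations. A toy stand-in for regression-based calibration."""
    feats = [(1.0, px, py) for px, py in pupil_pts]
    AtA = [[sum(f[i] * f[j] for f in feats) for j in range(3)] for i in range(3)]
    coeffs = []
    for dim in range(2):
        Atb = [sum(f[i] * g[dim] for f, g in zip(feats, scene_pts)) for i in range(3)]
        coeffs.append(solve(AtA, Atb))
    return coeffs  # [[a0, a1, a2] for gx, same for gy]

def predict(coeffs, px, py):
    """Map a pupil position to a scene-camera gaze point."""
    (x0, x1, x2), (y0, y1, y2) = coeffs
    return x0 + x1 * px + x2 * py, y0 + y1 * px + y2 * py
```

The benefit of gathering many collection-marker points, as the abstract describes, is that an over-determined fit like this one averages out per-point noise and leaves samples to spare for outlier rejection and accuracy evaluation.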


european test symposium | 2014

Reducing embedded software radiation-induced failures through cache memories

Thiago Santini; Paolo Rech; Gabriel L. Nazar; Luigi Carro; Flávio Rech Wagner

Cache memories are traditionally disabled in space-level and safety-critical applications, since it was believed that the sensitive area they introduce would compromise the system reliability. As technology has evolved, the speed gap between logic and main memory has increased in such a way that disabling caches slows the code much more than in the past. As a result, the processor is exposed for a much longer time in order to compute the same workload. In this paper we demonstrate that, on modern embedded processors, enabling caches may bring benefits to critical systems: the larger exposed area may be compensated by the shorter exposure time, leading to an overall improved reliability. We describe the Mean Workload Between Failures, an intuitive metric to evaluate the impact of enabling caches for a given generic application error rate. The proposed metric is experimentally validated through an extensive radiation test campaign using a 28 nm off-the-shelf ARM-based SoC as a case study. The failure probability of the bare-metal application is decreased when the L1 cache is enabled but increased when L2 is also enabled. We also discuss when L2 caches could make the device more reliable.
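The trade-off described above (a larger exposed area compensated by a shorter exposure time) can be made concrete with a simple workload-between-failures figure: failures arrive at a rate proportional to the sensitive cross-section times the particle flux, so the workload completed between failures grows whenever execution time shrinks faster than the cross-section grows. The formulation and all numbers below are illustrative assumptions, not the paper's exact definition or measured values.

```python
def mwbf(workload_per_run, exec_time_s, cross_section_cm2, flux_cm2_s):
    """Illustrative Mean Workload Between Failures.
    failure rate [1/s] = cross_section [cm^2] * flux [particles/(cm^2 * s)];
    expected failures per run = failure rate * execution time;
    MWBF = workload completed per run / failures per run."""
    failure_rate = cross_section_cm2 * flux_cm2_s
    failures_per_run = failure_rate * exec_time_s
    return workload_per_run / failures_per_run

# Hypothetical numbers: enabling caches enlarges the sensitive area by 40%
# but halves the execution time, so the net figure still improves.
caches_off = mwbf(1e6, 1.0, 1.0e-9, 100.0)
caches_on = mwbf(1e6, 0.5, 1.4e-9, 100.0)
```

With these assumed inputs the cached configuration completes more work between failures despite its larger cross-section, mirroring the qualitative argument the abstract makes for the L1 cache.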


international conference on computer vision theory and applications | 2017

EyeRecToo: Open-source Software for Real-time Pervasive Head-mounted Eye Tracking.

Thiago Santini; David Geisler; Enkelejda Kasneci

Head-mounted eye tracking offers remarkable opportunities for research and applications regarding pervasive health monitoring, mental state inference, and human-computer interaction in dynamic scenarios. Although a plethora of software for the acquisition of eye-tracking data exists, such software often exhibits critical issues when pervasive eye tracking is considered, e.g., closed source code, costly eye-tracker hardware dependencies, and the need for a human supervisor during calibration. In this paper, we introduce EyeRecToo, an open-source software for real-time pervasive head-mounted eye tracking. Out of the box, EyeRecToo offers multiple real-time state-of-the-art pupil detection and gaze estimation methods, which can be easily replaced by user-implemented algorithms if desired. A novel calibration method that allows users to calibrate the system without the assistance of a human supervisor is also integrated. Moreover, the software supports multiple head-mounted eye-tracking hardware devices, records eye and scene videos, and stores pupil and gaze information, which are also available as a real-time stream. Thus, EyeRecToo serves as a framework to quickly enable pervasive eye-tracking research and applications. Available at: www.ti.uni-tuebingen.de/perception.


ubiquitous computing | 2016

Eyes wide open? eyelid location and eye aperture estimation for pervasive eye tracking in real-world scenarios

Thiago Santini; David Geisler; Thomas C. Kübler; Wolfgang Rosenstiel; Enkelejda Kasneci

Eyelid identification and aperture estimation provide key data that can be used to infer valuable information about a subject's mental state (e.g., vigilance, fatigue, and drowsiness) as well as to validate or reduce the search space of other eye features. In this paper, we consider these tasks from the perspective of pervasive eye tracking, taking into account the multiple challenges and constraints that arise in this scenario. A novel method for eyelid identification and aperture estimation is proposed and evaluated against challenging data from an eye-tracking experiment conducted in driving scenarios in the wild. The proposed method outperforms a state-of-the-art approach by up to 40 percentage points and runs in real time on state-of-the-art eye-tracking systems. The method implementation and the realistic dataset are provided openly at www.ti.uni-tuebingen.de/perception.


ubiquitous computing | 2016

Evaluation of state-of-the-art pupil detection algorithms on remote eye images

David Geisler; Thiago Santini; Wolfgang Rosenstiel; Enkelejda Kasneci

Eye movements are a powerful source of information as well as the most intuitive form of interaction. Although eye-tracking technology is still in its infancy, it offers great potential for novel communication solutions and applications. Whereas head-mounted eye trackers are widely used in research, several applications require highly unintrusive eye tracking, ideally realized by means of a single, low-cost camera placed away from the subject. However, such remote devices usually provide low-resolution images and pose several challenges to gaze position estimation. The key challenge in such a scenario is the robust detection of the pupil center in the recorded image. We evaluated eight state-of-the-art algorithms for pupil detection on three manually labeled data sets recorded in remote tracking scenarios. Among the evaluated algorithms, ElSe [6] proved to be the best-performing approach on a total of 3202 images from remote eye tracking, which include changing illumination, occlusion, head movements, and off-axial camera positions. In addition, we contribute a new data set with 445 annotated images, recorded in a fixed setup with a low-cost camera capable of using natural and infrared light.


Computer Vision and Image Understanding | 2018

PuRe: Robust pupil detection for real-time pervasive eye tracking

Thiago Santini; Enkelejda Kasneci

Real-time, accurate, and robust pupil detection is an essential prerequisite to enable pervasive eye tracking and its applications, e.g., gaze-based human-computer interaction, health monitoring, foveated rendering, and advanced driver assistance. However, automated pupil detection has proved to be an intricate task in real-world scenarios due to a large mixture of challenges such as quickly changing illumination and occlusions. In this paper, we introduce the Pupil Reconstructor (PuRe), a method for pupil detection in pervasive scenarios based on novel edge segment selection and conditional segment combination schemes; the method also includes a confidence measure for the detected pupil. The proposed method was evaluated on over 316,000 images acquired with four distinct head-mounted eye-tracking devices. Results show a pupil detection rate improvement of over 10 percentage points w.r.t. state-of-the-art algorithms on the two most challenging data sets (6.46 percentage points over all data sets), further pushing the envelope for pupil detection. Moreover, we advance the evaluation protocol of pupil detection algorithms by also considering eye images in which pupils are not present and by contributing a new data set of mostly closed-eye images. In this aspect, PuRe improved precision and specificity w.r.t. state-of-the-art algorithms by 25.05 and 10.94 percentage points, respectively, demonstrating the meaningfulness of PuRe's confidence measure. PuRe operates in real time for modern eye trackers (at 120 fps) and is fully integrated into EyeRecToo, an open-source state-of-the-art software for pervasive head-mounted eye tracking. The proposed method and data set are available at http://www.ti.uni-tuebingen.de/perception.
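One way to picture a confidence measure for a detected pupil, in the spirit of (though not identical to) the one described above, is the fraction of the candidate ellipse outline that is supported by detected edge pixels. Everything below, including the neighborhood tolerance and sampling density, is an illustrative assumption.

```python
import math

def outline_support(edge_pixels, cx, cy, a, b, samples=64):
    """Toy confidence: fraction of points sampled on the candidate ellipse
    outline that lie within one pixel of a detected edge pixel.
    edge_pixels: iterable of (x, y) integer coordinates."""
    edges = set(edge_pixels)
    hits = 0
    for i in range(samples):
        t = 2 * math.pi * i / samples
        px, py = cx + a * math.cos(t), cy + b * math.sin(t)
        if any((round(px) + dx, round(py) + dy) in edges
               for dx in (-1, 0, 1) for dy in (-1, 0, 1)):
            hits += 1
    return hits / samples
```

A candidate whose outline is well covered by edges gets a score near 1.0, while a spurious candidate scores near 0.0; thresholding such a score is what allows a detector to report "no pupil" on closed-eye images instead of a false positive.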


ubiquitous computing | 2016

Brightness- and motion-based blink detection for head-mounted eye trackers

Tobias Appel; Thiago Santini; Enkelejda Kasneci

Blinks are an indicator of fatigue or drowsiness and can assist in the diagnosis of mental disorders, such as schizophrenia. Additionally, a blink that obstructs the pupil impairs the performance of other eye-tracking algorithms, such as pupil detection, and often introduces noise into the gaze estimation. In this paper, we present a blink detection algorithm that is tailored towards head-mounted eye trackers and is robust to calibration-based variations, such as translation or rotation of the eye. The proposed approach reached 96.35% accuracy on a realistic and challenging data set and runs in real time even on low-end devices, rendering the proposed method suited for pervasive eye tracking.
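The brightness cue can be sketched as follows: when the eyelid closes, the mostly skin-colored lid raises the mean brightness of the eye image, so frames whose brightness is a statistical outlier within the sequence can be flagged. The outlier rule and the factor k below are illustrative simplifications; the published method also exploits motion information.

```python
def detect_blinks(brightness, k=1.5):
    """Toy brightness-based blink flagging: mark frames whose mean
    eye-image brightness exceeds mean + k * std of the sequence.
    brightness: list of per-frame mean gray values."""
    n = len(brightness)
    mu = sum(brightness) / n
    sd = (sum((b - mu) ** 2 for b in brightness) / n) ** 0.5
    threshold = mu + k * sd
    return [b > threshold for b in brightness]
```

In a real-time setting, the statistics would be maintained over a sliding window rather than the whole sequence, which is also what makes such a cue robust to per-recording differences in illumination.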

Collaboration


Dive into Thiago Santini's collaborations.

Top Co-Authors

Flávio Rech Wagner (Universidade Federal do Rio Grande do Sul)
Paolo Rech (Universidade Federal do Rio Grande do Sul)
Luigi Carro (Universidade Federal do Rio Grande do Sul)
Tobias Appel (University of Tübingen)
Gabriel L. Nazar (Universidade Federal do Rio Grande do Sul)