Dragos Datcu
Delft University of Technology
Publications
Featured research published by Dragos Datcu.
computer systems and technologies | 2008
Robert Horlings; Dragos Datcu; Léon J. M. Rothkrantz
Our project focused on recognizing emotion from human brain activity, measured by EEG signals. We have proposed a system to analyze EEG signals and classify them into five classes on two emotional dimensions, valence and arousal. This system was designed using prior knowledge from other research and is meant to assess the quality of emotion recognition using EEG signals in practice. To perform this assessment, we gathered a dataset of EEG signals measured from people who were emotionally stimulated by pictures. This method enabled us to teach our system the relationship between the characteristics of the brain activity and the emotion. We found that the EEG signals contained enough information to separate five different classes on both the valence and arousal dimensions. However, using a 3-fold cross-validation method for training and testing, we reached classification rates of 32% for recognizing the valence dimension from EEG signals and 37% for the arousal dimension. Much better classification rates were achieved when using only the extreme values on both dimensions: 71% and 81%, respectively.
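As an illustration only (not the authors' code), the sketch below shows the kind of 3-fold cross-validation evaluation described above, using scikit-learn on placeholder EEG feature vectors; the feature extraction and the choice of classifier are assumptions, not taken from the paper.

```python
# Hypothetical sketch: 3-fold cross-validation of an emotion classifier on
# pre-extracted EEG features. Feature vectors and labels are random
# placeholders standing in for real EEG data.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 32))             # placeholder EEG feature vectors
y_valence = rng.integers(0, 5, size=200)   # 5 classes on the valence dimension

clf = SVC(kernel="rbf", C=1.0)
scores = cross_val_score(clf, X, y_valence, cv=3)   # 3-fold CV as in the paper
print("mean valence accuracy: %.2f" % scores.mean())
```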
computer systems and technologies | 2007
Dragos Datcu; Léon J. M. Rothkrantz
The paper highlights the performance of video sequence-oriented facial expression recognition using the Active Appearance Model (AAM), compared with analysis based on still pictures. The AAM is used to extract relevant information regarding the shapes of the faces to be analyzed. Specific key points from a Facial Characteristic Point (FCP) model are used to derive the set of features. These features are used to classify the expression of a new face sample into one of the prototypic emotions. The classification method uses Support Vector Machines.
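The sketch below is a hypothetical illustration of this kind of pipeline: simple geometric features derived from facial characteristic points (as an AAM fit would supply) fed to a Support Vector Machine. The point indexing, the normalisation, and the data are assumptions for illustration, not the authors' method.

```python
# Illustrative sketch (not the paper's code): FCP coordinates -> geometric
# feature vector -> SVM classification into prototypic emotions.
import numpy as np
from sklearn.svm import SVC

def fcp_features(points):
    """points: (N, 2) array of facial characteristic point coordinates."""
    points = np.asarray(points, dtype=float)
    iod = np.linalg.norm(points[0] - points[1])   # assume rows 0, 1 = eye centres
    centred = points - points.mean(axis=0)
    return (centred / iod).ravel()                # scale-invariant shape vector

rng = np.random.default_rng(0)
X = np.stack([fcp_features(rng.normal(size=(20, 2))) for _ in range(120)])
y = rng.integers(0, 6, size=120)                  # six prototypic emotions

clf = SVC(kernel="rbf").fit(X, y)
print("training accuracy on placeholder data:", clf.score(X, y))
```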
international conference on multimedia and expo | 2005
Dragos Datcu; Léon J. M. Rothkrantz
For many decades, automatic facial expression recognition has been considered a challenging problem in the fields of pattern recognition and robotic vision. The current research proposes relevance vector machines (RVM) as a novel classification technique for the recognition of facial expressions in static images. Aspects related to the use of Support Vector Machines are also presented. The test data were selected from the Cohn-Kanade facial expression database. We report 90.84% recognition rates for RVM on six universal expressions based on a range of experiments. A discussion comparing different classification methods is included.
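Relevance vector machines are not part of scikit-learn, so the hedged sketch below only illustrates the comparable SVM pipeline that the paper discusses alongside RVM; the features and labels are random placeholders, not Cohn-Kanade data.

```python
# Sketch of a six-expression classifier evaluation with an SVM standing in
# for the RVM; data are synthetic placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 40))            # placeholder expression features
y = rng.integers(0, 6, size=300)          # six universal expressions

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
clf = SVC(kernel="rbf", probability=True).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```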
computer systems and technologies | 2011
Dragos Datcu; Léon J. M. Rothkrantz
This paper proposes a bimodal system for emotion recognition that uses face and speech analysis. Hidden Markov models (HMMs) are used to learn and describe the temporal dynamics of the emotion clues in the visual and acoustic channels. This approach provides a powerful method for fusing the data extracted from the separate modalities. The paper presents the best performing models and the results of the proposed recognition system.
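A hedged sketch of one possible reading of this approach follows: one HMM per emotion and per modality, with classification by summing the per-modality log-likelihoods (late fusion). It uses the third-party hmmlearn package and random placeholder sequences; the paper's actual features, model topology, and fusion scheme may differ.

```python
# Per-emotion HMMs for face and speech features, fused at score level.
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(2)
emotions = ["happy", "sad", "angry"]

def train_hmm(seq):
    # seq: (n_frames, n_features) observation sequence for one emotion/modality
    model = GaussianHMM(n_components=3, covariance_type="diag", n_iter=20)
    model.fit(seq)
    return model

# One HMM per emotion and per modality (placeholder training sequences).
face_hmms = {e: train_hmm(rng.normal(size=(80, 10))) for e in emotions}
speech_hmms = {e: train_hmm(rng.normal(size=(80, 12))) for e in emotions}

# Classify a new clip by summing log-likelihoods from both modalities.
face_obs, speech_obs = rng.normal(size=(40, 10)), rng.normal(size=(40, 12))
scores = {e: face_hmms[e].score(face_obs) + speech_hmms[e].score(speech_obs)
          for e in emotions}
print("predicted emotion:", max(scores, key=scores.get))
```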
symposium on spatial user interaction | 2013
Dragos Datcu; Stephan Lukosch
The ability to use free-hand gestures is extremely important for mobile augmented reality applications. This paper proposes a computer vision-driven model for natural free-hand interaction in augmented reality. The novelty of the research is the use of robust hand modeling that combines Viola-Jones detection with Active Appearance Models. A usability study evaluates the free-hand interaction model with a focus on the accuracy of hand-based pointing for menu navigation and menu item selection. The results indicate high accuracy of pointing and high usability of free-hand interaction in augmented reality. The research is part of a joint project of TU Delft and the Netherlands Forensic Institute in The Hague, aiming at the development of novel technologies for crime scene investigation.
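For illustration, a minimal OpenCV sketch of the Viola-Jones detection stage only; the cascade file name is a placeholder (OpenCV does not ship a hand cascade), and the AAM refinement used in the paper is not shown.

```python
# Viola-Jones style hand detection on webcam frames (detection stage only).
import cv2

cascade = cv2.CascadeClassifier("hand_cascade.xml")   # placeholder model file
if cascade.empty():
    raise SystemExit("hand_cascade.xml not found; a trained hand cascade is assumed")

cap = cv2.VideoCapture(0)   # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    hands = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in hands:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("hand detection", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```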
intelligence and security informatics | 2014
Dragos Datcu; Marina-Anca Cidotă; Heide Lukosch; Stephan Lukosch
For operational units in the security domain that work together in teams, it is important to quickly and adequately exchange context-related information. Currently, information exchange is based on oral communication only. This paper reports on different scenarios from the security domain in which augmented reality (AR) techniques are used to support such information exchange. The scenarios have been elicited using an end-user centred design approach. To support these scenarios, an AR environment has been developed, and the usability of the AR support has been evaluated with experts from different operational units in the security domain. The first evaluation shows that the scenarios are well defined and the AR environment can successfully support information exchange in teams operating in the security domain.
conference on computer supported cooperative work | 2015
Stephan Lukosch; Heide Lukosch; Dragos Datcu; Marina-Anca Cidotă
For operational units in the security domain that work together in teams, it is important to quickly and adequately exchange context-related information to ensure well-working collaboration. Currently, most information exchange is based on oral communication. This paper reports on different scenarios from the security domain in which augmented reality (AR) techniques are used to support such information exchange. The scenarios have been designed with a User Centred Design approach, in order to make the scenarios as realistic as possible. To support these scenarios, an AR system has been developed and evaluated in two rounds. In the first round, the usability and feasibility of the AR support has been evaluated with experts from different operational units in the security domain. The second evaluation round then focussed on the effect of AR on collaboration and situational awareness within the expert teams. With regard to the usability and feasibility of AR, the evaluation shows that the scenarios are well defined and the AR system can successfully support information exchange in teams operating in the security domain. The second evaluation round showed that AR can especially improve the situational awareness of remote colleagues not physically present at a scene.
computer systems and technologies | 2013
Dragos Datcu; Marina-Anca Cidotă; Stephan Lukosch; Léon J. M. Rothkrantz
This paper presents a comparative study on the influence of different face regions on contactless extraction of the heart rate by computer vision in the visible spectrum. A second novelty of our research is the use of Active Appearance Models for computing the shape of the face and of the facial features. In an experimental setup, we determined that the forehead and cheek regions are the most relevant for computing the heart rate. This outcome leads to an optimized face scanning method, faster processing times, and better pulse detection results. The findings were implemented in an automatic prototype system for non-contact face analysis. Our methods were tested and validated using video recordings of people in a laboratory setup.
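The sketch below illustrates the general pulse-estimation idea on a synthetic trace: average the green channel over a forehead region per frame, then pick the dominant frequency in the cardiac band. The ROI selection via Active Appearance Models and the paper's exact processing are not reproduced here.

```python
# Estimate heart rate from a per-frame mean green-channel trace via FFT.
# The trace is synthetic (1.2 Hz = 72 bpm plus noise) so the sketch runs
# without video input.
import numpy as np

fps = 30.0
t = np.arange(0, 20, 1.0 / fps)                  # 20 s of "video" at 30 fps
trace = np.sin(2 * np.pi * 1.2 * t) + 0.5 * np.random.randn(t.size)

trace = trace - trace.mean()
spectrum = np.abs(np.fft.rfft(trace))
freqs = np.fft.rfftfreq(trace.size, d=1.0 / fps)

band = (freqs > 0.7) & (freqs < 4.0)             # roughly 42-240 bpm
bpm = 60.0 * freqs[band][np.argmax(spectrum[band])]
print("estimated heart rate: %.1f bpm" % bpm)
```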
International Journal of Human-computer Interaction | 2015
Dragos Datcu; Stephan Lukosch; Frances M. T. Brazier
One of the key challenges of augmented reality (AR) interfaces is to design effective hand-based interaction supported by computer vision. Hand-based interaction requires free-hands tracking to support user interaction in AR for which this article presents a novel approach. This approach makes it possible to compare different types of hand-based interaction in AR for navigating using a spatial user interface. Quantitative and qualitative analyses of a study with 25 subjects indicate that tangible interaction is the preferred type of interaction with which to determine the position of the user interface in AR and to physically point to a preferred option for navigation in augmented reality.
systems, man and cybernetics | 2006
Siska Fitrianie; Dragos Datcu; Léon J. M. Rothkrantz
Knowledge of the current situation in the world is a precondition for setting goals and domains of action. Special events in the world can also be used to initiate new actions or interrupt ongoing processes. A prototype of a communication interface has been developed that enables users to create visual messages representing concepts or ideas. The messages are constructed using a spatial arrangement of visual symbols. The interface has been tested by observers in a simulated crisis situation. The system processes incoming messages to build a world model using an ontology. A blackboard on a dynamic ad-hoc network shares this knowledge and updates the displays of the mobile devices of users within the same region of the crisis event.