
Publication


Featured research published by Erik Strahl.


Robot and Human Interactive Communication | 2016

Using natural language feedback in a neuro-inspired integrated multimodal robotic architecture

Johannes Twiefel; Xavier Hinaut; Marcelo Borghetti; Erik Strahl; Stefan Wermter

In this paper we present a multimodal human-robot interaction architecture which is able to combine information coming from different sensory inputs and can generate feedback that implicitly teaches the user how to interact with the robot. The system combines vision, speech and language with inference and feedback. The system environment consists of a Nao robot which has to learn objects situated on a table solely by understanding absolute and relative object locations uttered by the user, and afterwards points at a desired object to show what it has learned. The results of a user study and performance test show the usefulness of the feedback produced by the system and also justify its use in real-world applications, as its classification accuracy on multimodal input is around 80.8%. In the experiments, the system was able to detect inconsistent input coming from different sensory modules in all cases and could generate useful feedback for the user from this information.
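The inconsistency detection described in the abstract can be illustrated with a small sketch. This is not the authors' implementation: the module names, the confidence averaging rule, and all values below are hypothetical stand-ins for the vision and speech/language hypotheses the architecture fuses.

```python
# Hypothetical sketch: fuse object hypotheses from two sensory modules
# and flag inconsistencies so the system can generate user feedback.

def fuse_hypotheses(vision, speech):
    """Each input maps object labels to confidence scores.
    Returns (best_label, feedback); feedback is None when consistent."""
    combined = {}
    for label in set(vision) | set(speech):
        # Simple average of per-modality confidences (missing -> 0.0).
        combined[label] = (vision.get(label, 0.0) + speech.get(label, 0.0)) / 2
    best = max(combined, key=combined.get)
    # Inconsistency: the two modalities prefer different labels.
    v_best = max(vision, key=vision.get)
    s_best = max(speech, key=speech.get)
    if v_best != s_best:
        feedback = f"I see a {v_best} but heard '{s_best}' - could you repeat?"
        return best, feedback
    return best, None

label, feedback = fuse_hypotheses(
    {"cup": 0.9, "ball": 0.1},   # vision module
    {"cup": 0.7, "box": 0.3},    # speech/language module
)
print(label, feedback)  # cup None
```

When the modalities disagree, the returned feedback string is what would be spoken to the user, prompting a correction instead of silently accepting noisy input.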


Archive | 2014

Object Learning with Natural Language in a Distributed Intelligent System — A Case Study of Human-Robot Interaction

Stefan Heinrich; Pascal Folleher; Peer Springstübe; Erik Strahl; Johannes Twiefel; Cornelius Weber; Stefan Wermter

The development of humanoid robots for helping humans as well as for understanding the human cognitive system is of significant interest in science and technology. How to bridge the large gap between the needs of natural human-robot interaction and the capabilities of current humanoid platforms is an important but open question. In this paper we describe a system that teaches a robot through a natural-language dialogue about its real environment in real time. For this, we integrate a fast object recognition method for the NAO humanoid robot and a hybrid ensemble learning mechanism. With a qualitative analysis we show the effectiveness of our system.


International Conference on Multisensor Fusion and Integration for Intelligent Systems | 2012

Smoke and mirrors — Virtual realities for sensor fusion experiments in biomimetic robotics

Johannes Bauer; Jorge Dávila-Chacón; Erik Strahl; Stefan Wermter

Considerable time and effort often go into designing and implementing experimental set-ups (ES) in robotics. These activities are usually not the focus of our research and thus go underreported. This results in replication of work and a lack of comparability. This paper lays out our view of the theoretical considerations necessary when deciding on the type of experiment to conduct. It describes our efforts in designing a virtual reality (VR) ES for experiments in biomimetic robotics. It also reports on experiments carried out and outlines those planned. It thus provides a basis for similar efforts by other researchers and will help make designing ES more rational and economical, and the results more comparable.


Robot and Human Interactive Communication | 2017

NICO — Neuro-inspired companion: A developmental humanoid robot platform for multimodal interaction

Erik Strahl; Sven Magg; Nicolás Navarro-Guerrero; Stefan Heinrich; Stefan Wermter

Interdisciplinary research, drawing from robotics, artificial intelligence, neuroscience, psychology, and cognitive science, is a cornerstone to advance the state-of-the-art in multimodal human-robot interaction and neuro-cognitive modeling. Research on neuro-cognitive models benefits from the embodiment of these models into physical, humanoid agents that possess complex, human-like sensorimotor capabilities for multimodal interaction with the real world. For this purpose, we develop and introduce NICO (Neuro-Inspired COmpanion), a humanoid developmental robot that fills a gap between necessary sensing and interaction capabilities and flexible design. This combination makes it a novel neuro-cognitive research platform for embodied sensorimotor computational and cognitive models in the context of multimodal interaction as shown in our results.


International Symposium on Neural Networks | 2017

Teaching emotion expressions to a human companion robot using deep neural architectures

Nikhil Churamani; Erik Strahl; Pablo V. A. Barros; Stefan Wermter

Human companion robots need to be sociable and responsive towards emotions to better interact with the human environment they are expected to operate in. This paper is based on the Neuro-Inspired COmpanion robot (NICO) and investigates a hybrid, deep neural network model to teach NICO to associate perceived emotions with expression representations using its on-board capabilities. The proposed model consists of a Convolutional Neural Network (CNN) and a Self-Organising Map (SOM) to perceive the emotions expressed by a human user towards NICO, and trains two parallel Multilayer Perceptron (MLP) networks to learn general as well as person-specific associations between perceived emotions and the robot's facial expressions.
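The general-versus-person-specific association idea can be sketched in miniature. This is not the authors' model: the parallel MLPs are replaced here by simple mean-of-observations tables, and the emotion labels, user names, and two-dimensional expression parameters are all hypothetical.

```python
from collections import defaultdict

# Hypothetical sketch: a general associator trained on everyone, plus
# a person-specific associator per user, mapping perceived emotions to
# expression settings (stand-ins for the paper's parallel MLPs).

class Associator:
    """Learns an average expression vector per perceived emotion."""
    def __init__(self):
        self.sums = defaultdict(lambda: [0.0, 0.0])  # (eyebrow, mouth)
        self.counts = defaultdict(int)

    def observe(self, emotion, expression):
        s = self.sums[emotion]
        s[0] += expression[0]
        s[1] += expression[1]
        self.counts[emotion] += 1

    def express(self, emotion):
        n = self.counts[emotion]
        if n == 0:
            return None
        return (self.sums[emotion][0] / n, self.sums[emotion][1] / n)

general = Associator()
personal = {}          # user id -> person-specific Associator

def teach(user, emotion, expression):
    general.observe(emotion, expression)
    personal.setdefault(user, Associator()).observe(emotion, expression)

def respond(user, emotion):
    # Prefer the person-specific association, fall back to the general one.
    specific = personal.get(user)
    return (specific and specific.express(emotion)) or general.express(emotion)

teach("alice", "happiness", (0.8, 0.9))
teach("bob", "happiness", (0.4, 0.5))
print(respond("alice", "happiness"))   # person-specific: (0.8, 0.9)
print(respond("carol", "happiness"))   # unknown user: general average, roughly (0.6, 0.7)
```

The design point the sketch illustrates is the fallback: a known user gets their personalised expression mapping, while an unseen user is served by the association learned across everyone.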


Human-Agent Interaction | 2017

The Impact of Personalisation on Human-Robot Interaction in Learning Scenarios

Nikhil Churamani; Paul Anton; Marc Brügger; Erik Fließwasser; Thomas Hummel; Julius Mayer; Waleed Mustafa; Hwei Geok Ng; Thi Linh Chi Nguyen; Quan Nguyen; Marcus Soll; Sebastian Springenberg; Sascha S. Griffiths; Stefan Heinrich; Nicolás Navarro-Guerrero; Erik Strahl; Johannes Twiefel; Cornelius Weber; Stefan Wermter

Advancements in Human-Robot Interaction involve robots being more responsive and adaptive to the human user they are interacting with. For example, robots model a personalised dialogue with humans, adapting the conversation to accommodate the user's preferences in order to allow natural interactions. This study investigates the impact of such personalised interaction capabilities of a human companion robot on its social acceptance, perceived intelligence and likeability in a human-robot interaction scenario. In order to measure this impact, the study makes use of an object learning scenario where the user teaches different objects to the robot using natural language. An interaction module is built on top of the learning scenario which engages the user in a personalised conversation before teaching the robot to recognise different objects. The two systems, i.e. with and without the interaction module, are compared with respect to how different users rate the robot on its intelligence and sociability. Although the system equipped with personalised interaction capabilities is rated lower on social acceptance, it is perceived as more intelligent and likeable by the users.


IEEE-RAS International Conference on Humanoid Robots | 2014

Robust fall detection with an assistive humanoid robot

German Ignacio Parisi; Erik Strahl; Stefan Wermter

Summary form only given. In this video we introduce a robot assistant that monitors a person in a household environment to promptly detect fall events. In contrast to the use of a fixed sensor, the humanoid robot will track and keep the moving person in the scene while they perform daily activities. For this purpose, we extended the humanoid Nao with a depth sensor attached to its head. The tracking framework, implemented with OpenNI, segments and tracks the person's position and body posture. We use a learning neural framework for processing the extracted body features and detecting abnormal behaviors, e.g. a fall event [1]. The neural architecture consists of a hierarchy of self-organizing neural networks for attenuating noise caused by tracking errors and detecting fall events from the video stream in real time. The tracking application, the neural framework, and the humanoid actuators communicate over the Robot Operating System (ROS), using publisher-subscriber nodes on the ROS network. When a fall event is detected, Nao will approach the person and ask whether assistance is needed. In any case, Nao will take a picture of the scene that can be sent to the caregiver or a relative for further human evaluation and agile intervention. The combination of this sensor technology with our neural network approach allows the robust detection of falls to be tailored independently of the background surroundings and in the presence of noise (tracking errors and occlusions) introduced by a real-world scenario. The video shows experiments run in a home-like environment.
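The publisher-subscriber wiring between tracking, fall detection, and the robot can be sketched without ROS. This is not the authors' code: the in-process message bus stands in for ROS topics, the topic names are hypothetical, and a head-height threshold replaces the paper's self-organising neural hierarchy.

```python
# Hypothetical sketch of the publisher-subscriber pattern connecting the
# tracking node, the fall detector, and the robot actuators.

class Bus:
    """Minimal in-process stand-in for a ROS topic graph."""
    def __init__(self):
        self.subscribers = {}

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, msg):
        for cb in self.subscribers.get(topic, []):
            cb(msg)

bus = Bus()
actions = []

def on_posture(msg):
    # Stand-in detector: flag a fall when the tracked head height drops
    # below a threshold (the real system uses a neural hierarchy here).
    if msg["head_height"] < 0.3:
        bus.publish("/fall_event", {"position": msg["position"]})

def on_fall(msg):
    # Robot node: approach the person and ask whether assistance is needed.
    actions.append(("approach", msg["position"]))
    actions.append(("ask", "Do you need assistance?"))

bus.subscribe("/posture", on_posture)
bus.subscribe("/fall_event", on_fall)

bus.publish("/posture", {"head_height": 1.5, "position": (2, 1)})  # walking
bus.publish("/posture", {"head_height": 0.1, "position": (2, 1)})  # fall
print(actions)
```

The decoupling is the point: the detector never calls the robot directly, so either node could be swapped out (or moved to another machine, as in a real ROS network) without changing the other.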


MuSRobS@IROS | 2015

A Multi-modal Approach for Assistive Humanoid Robots.

German Ignacio Parisi; Johannes Bauer; Erik Strahl; Stefan Wermter


arXiv: Robotics | 2018

Deep Neural Object Analysis by Interactive Auditory Exploration with a Humanoid Robot.

Manfred Eppe; Erik Strahl; Stefan Wermter


Robot and Human Interactive Communication | 2017

Hey robot, why don't you talk to me?

Hwei Geok Ng; Paul Anton; Marc Brügger; Nikhil Churamani; Erik Fließwasser; Thomas Hummel; Julius Mayer; Waleed Mustafa; Thi Linh Chi Nguyen; Quan Nguyen; Marcus Soll; Sebastian Springenberg; Sascha S. Griffiths; Stefan Heinrich; Nicolás Navarro-Guerrero; Erik Strahl; Johannes Twiefel; Cornelius Weber; Stefan Wermter

Collaboration


Dive into Erik Strahl's collaborations.

Top Co-Authors


Stefan Heinrich

Hamburg University of Technology

View shared research outputs