Publication


Featured research published by Ioannis Iossifidis.


Intelligent Robots and Systems | 2006

Dynamical Systems Approach for the Autonomous Avoidance of Obstacles and Joint-limits for a Redundant Robot Arm

Ioannis Iossifidis; Gregor Schöner

We extend the attractor dynamics approach to generate goal-directed movement of a redundant, anthropomorphic arm while avoiding dynamic obstacles and respecting joint limits. To make the robot's movements human-like, we generate approximately straight-line trajectories by using two heading direction angles of the tool-point, quite analogously to how movement is represented in the primate central nervous system. Two additional angles control the tool's spatial orientation so that it follows the tool-point's collision-free path. A fifth equation governs the redundancy angle, which controls the elevation of the elbow so as to avoid obstacles and respect joint limits. These variables make it possible to generate movement while sitting in an attractor (or, in the language of the potential field approach, in a minimum). We demonstrate the approach on an assistant robot, which interacts with human users in a shared workspace.
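As a rough illustration of the kind of attractor dynamics described above, the sketch below integrates a single heading-direction angle toward a target while an obstacle contributes a repellor whose strength decays with distance. The functional forms, gains, and ranges are assumptions for illustration, not the values or equations used in the paper.

```python
import numpy as np

# Attractor dynamics for one heading-direction angle (planar sketch).
# phi: current heading direction of the tool-point
# psi_tar / psi_obs: directions toward the target / the obstacle
# All gains and ranges are illustrative assumptions, not paper values.

def heading_rate(phi, psi_tar, psi_obs, d_obs,
                 lam_tar=2.0, beta1=5.0, beta2=0.5, sigma=0.4):
    # Target contribution: attractor at psi_tar
    f_tar = -lam_tar * np.sin(phi - psi_tar)
    # Obstacle contribution: repellor at psi_obs, whose strength decays
    # exponentially with obstacle distance d_obs and is limited to an
    # angular range sigma around the obstacle direction.
    weight = beta1 * np.exp(-d_obs / beta2)
    f_obs = weight * (phi - psi_obs) * np.exp(-(phi - psi_obs) ** 2 / (2 * sigma ** 2))
    return f_tar + f_obs

# Euler integration: the heading settles in an attractor that is shifted
# away from the obstacle direction while still pointing roughly at the target.
phi, dt = 0.0, 0.01
for _ in range(2000):
    phi += dt * heading_rate(phi, psi_tar=0.8, psi_obs=0.5, d_obs=0.3)
print(f"relaxed heading: {phi:.3f} rad")
```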


International Conference on Robotics and Automation | 2004

Autonomous reaching and obstacle avoidance with the anthropomorphic arm of a robotic assistant using the attractor dynamics approach

Ioannis Iossifidis; Gregor Schöner

To enable a robotic assistant to autonomously reach for and transport objects while avoiding obstacles, we have generalized the attractor dynamics approach established for vehicles to trajectory formation in robot arms. This approach is able to deal with the time-varying environments that occur when a human operator moves in a shared workspace. Stable fixed points (attractors) for the heading direction of the end-effector shift during movement and are tracked by the system. This enables the attractor dynamics approach to avoid the spurious states that hamper potential field methods. Because it separates planning and control computationally, the approach is also simpler to implement. The stability properties of the movement plan make it possible to deal with fluctuating and imprecise sensory information. We implement this approach on a seven-degree-of-freedom anthropomorphic arm reaching for objects on a working surface. We use an exact solution of the inverse kinematics, which enables us to steer the spatial position of the elbow clear of obstacles. The straight-line trajectories of the end-effector that emerge as long as the arm is far from obstacles make the movement goals of the robotic assistant predictable for the human operator, improving man-machine interaction.
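The point that attractors shift during the movement and are tracked by the system can be made concrete with a toy sketch: a point end-effector moves at constant speed along its heading direction, the attractor (the direction toward a fixed target) shifts as the end-effector moves, and fast relaxation of the heading dynamics keeps the state near the moving fixed point. The simplified planar setting and all parameter values are assumptions.

```python
import numpy as np

# Toy 2-D end-effector moving under heading-direction dynamics.
# The attractor (direction toward a fixed target) shifts as the
# end-effector moves; because relaxation is fast compared to the
# movement, the heading tracks the moving fixed point and the
# end-effector homes in on the target. Parameters are assumptions.

target = np.array([1.0, 0.5])
pos = np.array([0.0, 0.0])
phi = -1.0                      # initial heading (rad), far from the attractor
speed, lam, dt = 0.2, 10.0, 0.01

for step in range(800):
    psi_tar = np.arctan2(*(target - pos)[::-1])   # current attractor location
    phi += dt * (-lam * np.sin(phi - psi_tar))    # fast heading relaxation
    pos += dt * speed * np.array([np.cos(phi), np.sin(phi)])

print("final distance to target:", np.linalg.norm(target - pos))
```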


IEEE Transactions on Autonomous Mental Development | 2011

Dynamic Neural Fields as Building Blocks of a Cortex-Inspired Architecture for Robotic Scene Representation

Stephan K. U. Zibner; Christian Faubel; Ioannis Iossifidis; Gregor Schöner

Based on the concepts of dynamic field theory (DFT), we present an architecture that autonomously generates scene representations by controlling gaze and attention, creating visual objects in the foreground, tracking objects, reading them into working memory, and taking into account their visibility. At the core of this architecture are three-dimensional dynamic neural fields (DNFs) that link feature to spatial information. These three-dimensional fields couple into lower dimensional fields, which provide the links to the sensory surface and to the motor systems. We discuss how DNFs can be used as building blocks for cognitive architectures, characterize the critical bifurcations in DNFs, as well as the possible coupling structures among DNFs. In a series of robotic experiments, we demonstrate how the DNF architecture provides the core functionalities of a scene representation.
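A minimal one-dimensional dynamic neural field along the lines of the Amari equation underlying DNFs is sketched below: with local excitation and broad inhibition, a sufficiently strong localized input drives the field through the detection instability and a self-stabilized activation peak forms. Kernel shape, resting level, and all parameter values are illustrative assumptions, not those of the architecture in the paper.

```python
import numpy as np

# 1-D dynamic neural field (Amari-type):
#   tau * du/dt = -u + h + S(x) + integral of w(x - x') * sigmoid(u(x')) dx'
# Parameters and kernel shape are illustrative assumptions.

n, dx = 101, 1.0
x = np.arange(n) * dx
h, tau, dt = -5.0, 10.0, 1.0

def gauss(center, sigma, amp):
    return amp * np.exp(-((x - center) ** 2) / (2 * sigma ** 2))

# Interaction kernel: local excitation plus broad (global) inhibition.
kernel = gauss(center=x[n // 2], sigma=4.0, amp=8.0) - 1.0
u = np.full(n, h)                                # field starts at resting level
stim = gauss(center=60.0, sigma=3.0, amp=6.5)    # localized external input

sigmoid = lambda a: 1.0 / (1.0 + np.exp(-4.0 * a))

for _ in range(500):
    conv = np.convolve(sigmoid(u), kernel, mode="same") * dx
    u += dt / tau * (-u + h + stim + conv)

# A self-stabilized activation peak has formed at the stimulus location.
print("peak location:", x[np.argmax(u)], "peak activation:", round(u.max(), 2))
```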


Intelligent Robots and Systems | 2003

Anthropomorphism as a pervasive design concept for a robotic assistant

Ioannis Iossifidis; Christoph Theis; Claudia Grote; Christian Faubel; Gregor Schöner

CORA is a robotic assistant whose task is to collaborate with a human operator on simple manipulation or handling tasks. Its sensory channels, comprising vision, audition, haptics, and force sensing, are used to extract perceptual information about the operator's speech, gestures, and gaze, and to recognize objects. The anthropomorphic robot arm makes goal-directed movements to pick up and hand over objects. The human operator may mechanically interact with the arm by pushing it away (haptics) or by taking an object out of the robot's gripper (force sensing). The design objective has been to exploit the human operator's intuition by modeling the mechanical structure, the senses, and the behaviors of the assistant on human anatomy, human perception, and human motor behavior.


Robot and Human Interactive Communication | 2002

CORA: An anthropomorphic robot assistant for human environment

Ioannis Iossifidis; C. Bruckhoff; Christoph Theis; Claudia Grote; Christian Faubel; Gregor Schöner

We describe the general concept, system architecture, hardware, and behavioral abilities of CORA (Cooperative Robot Assistant), an autonomous non-mobile robot assistant. Starting from our basic assumption that the behavior to be performed determines the internal and external structure of the behaving system, we have designed CORA anthropomorphically to allow for human-like behavioral strategies in solving complex tasks. Although CORA was built as a prototype of a service robot system to assist a human partner in industrial assembly tasks, we show that CORA's behavioral abilities also transfer to a household environment. After describing the hardware platform and the basic concepts of our approach, we present experimental results from an assembly task.


International Conference on Intelligent Transportation Systems | 2008

Towards a Driver Model: Preliminary Study of Lane Change Behavior

Ueruen Dogan; Hannes Edelbrunner; Ioannis Iossifidis

The presented work formulates a framework in which early prediction of a driver's lane change behavior is realized. We aim to build a representation of drivers' lane change behavior in order to recognize and predict drivers' intentions, as a first step towards a realistic driver model. In the test bed of the Institut für Neuroinformatik, based on the traffic simulator NISYS TRS 1, 10 individuals took part in the experiments and performed more than 150 lane change maneuvers. Lane offset, distance to the car in front, and time to contact were recorded. The acquired data was used to train, in parallel, a recurrent neural network, a feed-forward neural network, and a set of support vector machines. In subsequent test drives the system was able to predict a lane change about 1.5 s before it occurred. The proposed approach describes a framework for lane change detection and prediction, which will serve as a prerequisite for a successful driver model.
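A hedged sketch of this kind of pipeline is given below: short windows of lane offset, distance to the lead car, and time to contact are fed to a classifier that labels whether a lane change is imminent. Synthetic data and scikit-learn's SVC stand in for the recorded drives and for the recurrent network, feed-forward network, and SVMs used in the study.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the recorded signals: each sample is a short
# window of (lane offset, distance to lead car, time to contact).
# Label 1 = a lane change follows shortly after the window, 0 = lane keeping.
rng = np.random.default_rng(0)

def make_window(lane_change):
    t = np.linspace(0, 1, 10)
    drift = 0.3 * t if lane_change else 0.0           # lateral offset starts to drift
    offset = drift + 0.02 * rng.standard_normal(10)
    dist = 30 - (10 if lane_change else 2) * t + rng.standard_normal(10)
    ttc = dist / 5.0                                  # crude time-to-contact proxy
    return np.concatenate([offset, dist, ttc])

X = np.array([make_window(i % 2 == 1) for i in range(400)])
y = np.array([i % 2 for i in range(400)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```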


International Conference on Robotics and Automation | 2011

Autonomous movement generation for manipulators with multiple simultaneous constraints using the attractor dynamics approach

Hendrik Reimann; Ioannis Iossifidis; Gregor Schöner

The movement of autonomous agents in natural environments is restricted by potentially large numbers of constraints. To generate behavior that fulfills all given constraints simultaneously, the attractor dynamics approach to movement generation represents each constraint by a dynamical system with attractors or repellors at desired or undesired values of a relevant variable. These dynamical systems are transformed into vector fields over the control variables of a robotic agent that force the state of the whole system in directions beneficial to the satisfaction of the behavioral constraint. The attractor dynamics approach was recently successfully applied to the generation of manipulator motion trajectories avoiding collision with obstacles [1] and constraints on gripper orientation during reaching and grasping movements [2]. Continuing that body of work, this paper proposes a system which generates movements satisfying both obstacle avoidance and gripper orientation constraints simultaneously. As an extension, the additional constraint of avoiding hardware limits for joint angles is included. Properties of the resulting system are demonstrated by a systematic study generating movements with a large number of constraints in different scene setups. Specific characteristics are highlighted by several showcase example movements.
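A much simplified version of the superposition idea is sketched below for a planar two-link arm: a target attractor formulated in task space is mapped into joint-velocity space through the Jacobian transpose, joint-limit repellors are added directly in joint space, and the contributions are simply summed. The arm, gains, and limit values are assumptions, not the manipulator or equations of the paper.

```python
import numpy as np

# Planar 2-link arm: each behavioral constraint contributes a term to
# dq/dt; contributions are summed. Gains and limits are assumptions.
L1, L2 = 0.5, 0.4
q = np.array([0.3, 0.5])                                 # joint angles (rad)
q_min, q_max = np.array([-0.1, 0.0]), np.array([2.5, 2.8])
target = np.array([0.4, 0.6])

def fk(q):
    return np.array([L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
                     L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])])

def jacobian(q):
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

def joint_rate(q, k_tar=2.0, k_lim=1.0, margin=0.3):
    # Constraint 1: attractor at the target position, mapped via J^T.
    f_tar = jacobian(q).T @ (k_tar * (target - fk(q)))
    # Constraint 2: repellors at the joint limits, active within a margin.
    f_lim = k_lim * (np.exp(-(q - q_min) / margin)
                     - np.exp(-(q_max - q) / margin))
    return f_tar + f_lim

dt = 0.01
for _ in range(2000):
    q = q + dt * joint_rate(q)

print("end-effector:", fk(q).round(3), "joints:", q.round(3))
```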


Robot and Human Interactive Communication | 2010

Natural human-robot interaction through spatial language: A Dynamic Neural Field approach

Yulia Sandamirskaya; John Lipinski; Ioannis Iossifidis; Gregor Schöner

For an autonomous robotic system, the ability to share the same workspace and interact with humans is the basis for cooperative behavior. In this work, we investigate human spatial language as the communicative channel between the robot and the human, facilitating their joint work on a tabletop. We specifically combine the theory of Dynamic Neural Fields that represent perceptual and cognitive states with motor control and linguistic input in a robotic demonstration. We show that such a neural dynamic framework can integrate across symbolic, perceptual, and motor processes to generate task-specific spatial communication in real time.
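One way to picture how a spatial term can be grounded in perceptual information is sketched below: a "left of" template centered on the reference object is combined multiplicatively with a map of candidate object locations, and the peak of the result selects the target. The template shape and grid are illustrative assumptions, not the Dynamic Neural Field architecture used in the paper.

```python
import numpy as np

# 2-D tabletop grid: an object map marks candidate object locations,
# a "left of" template is centered on the reference object, and the
# combined map selects the object that best fits the spatial term.
n = 50
yy, xx = np.mgrid[0:n, 0:n]

def blob(cx, cy, sigma=2.0):
    return np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / (2 * sigma ** 2))

objects = blob(10, 25) + blob(40, 25) + blob(25, 10)   # three candidate objects
reference = (30, 25)                                    # (x, y) of the reference object

# "Left of" template: mass to the left of the reference, decaying with distance.
dx, dy = xx - reference[0], yy - reference[1]
left_of = np.clip(-dx, 0, None) * np.exp(-(dx ** 2 + dy ** 2) / (2 * 15.0 ** 2))

combined = objects * left_of
iy, ix = np.unravel_index(np.argmax(combined), combined.shape)
print("selected object location (x, y):", (ix, iy))     # expect near (10, 25)
```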


Intelligent Robots and Systems | 2010

Generating collision free reaching movements for redundant manipulators using dynamical systems

Hendrik Reimann; Ioannis Iossifidis; Gregor Schöner

For autonomous robots to manipulate objects in unknown environments, they must be able to move their arms without colliding with nearby objects, other agents, or humans. The simultaneous avoidance of multiple obstacles in real time by all link segments of a manipulator is still a hard task both in practice and in theory. We present a systematic scheme for the generation of collision-free movements for redundant manipulators in scenes with arbitrarily many obstacles. Based on the dynamical systems approach to robotics, constraints are formulated as contributions to a dynamical system that erect attractors for targets and repellors for obstacles. These contributions are formulated in terms of variables relevant to each constraint and then transformed into vector fields over the manipulator joint velocity vector, an embedding space in which all constraints are simultaneously observed. We demonstrate the feasibility of the approach by implementing it on a real anthropomorphic 8-degree-of-freedom redundant manipulator. In addition, performance is characterized by detecting failures in a systematic simulation experiment in randomized scenes with varying numbers of obstacles.
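The per-link obstacle handling can be illustrated, under strong simplifications, for a planar two-link arm: for sampled points along the links, a repellor force pointing away from the obstacle and decaying with distance is mapped into joint velocities through that point's Jacobian transpose, and these contributions are summed with a target attractor. The gains, decay scale, and sampled points are assumptions, not the scheme's actual formulation.

```python
import numpy as np

# Planar 2-link arm; obstacle avoidance for points along both links.
# All gains and the decay scale are illustrative assumptions.
L = np.array([0.5, 0.4])
obstacle = np.array([0.55, 0.55])
target = np.array([0.1, 0.8])
q = np.array([0.1, 0.4])

def point_on_arm(q, link, s):
    """Position of a point a fraction s along the given link (0 or 1)."""
    p = np.array([L[0] * np.cos(q[0]), L[0] * np.sin(q[0])])
    if link == 0:
        return s * p
    tip = p + np.array([L[1] * np.cos(q[0] + q[1]), L[1] * np.sin(q[0] + q[1])])
    return p + s * (tip - p)

def point_jacobian(q, link, s, eps=1e-6):
    """Numerical Jacobian of the point position w.r.t. the joint angles."""
    J = np.zeros((2, 2))
    for i in range(2):
        dq = np.zeros(2); dq[i] = eps
        J[:, i] = (point_on_arm(q + dq, link, s) - point_on_arm(q - dq, link, s)) / (2 * eps)
    return J

def joint_rate(q, k_tar=2.0, k_obs=1.5, d0=0.15):
    # Target attractor acting on the end-effector.
    rate = point_jacobian(q, 1, 1.0).T @ (k_tar * (target - point_on_arm(q, 1, 1.0)))
    # Obstacle repellors acting on sampled points of both links.
    for link in (0, 1):
        for s in (0.5, 1.0):
            p = point_on_arm(q, link, s)
            away = p - obstacle
            d = np.linalg.norm(away)
            rate += point_jacobian(q, link, s).T @ (k_obs * np.exp(-d / d0) * away / d)
    return rate

dt = 0.01
for _ in range(3000):
    q = q + dt * joint_rate(q)
# The end-effector should have moved toward the target while the sampled
# arm points were pushed away from the obstacle.
print("end-effector:", point_on_arm(q, 1, 1.0).round(3))
```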


Robot and Human Interactive Communication | 2001

Image processing methods for interactive robot control

C. Theis; Ioannis Iossifidis; A. Steinhage

In this paper we describe a straightforward technique for tracking a human hand based on images acquired by an active stereo camera system. We demonstrate the implementation of this method on an anthropomorphic assistance robot as part of a multi-modal man-machine interaction system: by detecting the hand position, the robot can interpret a human pointing gesture as the specification of a target object to grasp.
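A small sketch of the general idea, hand detection by skin-color segmentation followed by a centroid estimate, is given below using OpenCV; the HSV thresholds and the monocular, color-based approach are assumptions for illustration, not the paper's stereo-based method.

```python
import cv2
import numpy as np

def find_hand(frame_bgr):
    """Return the (x, y) centroid of the largest skin-colored region, or None.

    Hedged sketch: a simple HSV skin-color threshold stands in for the
    paper's stereo-based hand tracking; threshold values are assumptions.
    """
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 40, 60), (25, 180, 255))   # rough skin-tone range
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

# Usage sketch: run on camera frames and pass the centroid to a
# pointing-gesture interpreter as the current hand position.
# cap = cv2.VideoCapture(0)
# ok, frame = cap.read()
# print(find_hand(frame))
```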

Collaboration


Top co-authors of Ioannis Iossifidis.

C. Theis

Ruhr University Bochum
