Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Hidenobu Sumioka is active.

Publication


Featured research published by Hidenobu Sumioka.


IEEE Transactions on Autonomous Mental Development | 2010

Body Schema in Robotics: A Review

Matej Hoffmann; Hugo Gravato Marques; Alejandro Hernandez Arieta; Hidenobu Sumioka; Max Lungarella; Rolf Pfeifer

How is our body imprinted in our brain? This seemingly simple question is the subject of investigation in diverse disciplines: psychology and philosophy originally, complemented more recently by the neurosciences. Despite substantial efforts, the mysteries of body representations are far from uncovered. The most widely used notions, body image and body schema, are still waiting to be clearly defined. The mechanisms that underlie body representations are co-responsible for the admirable capabilities that humans and many mammals display: combining information from multiple sensory modalities, controlling their complex bodies, adapting to growth and failures, and using tools. These features are also desirable in robots. This paper surveys body representations in biology from a functional or computational perspective to set the ground for a review of the concept of body schema in robotics. First, we examine application-oriented research: how a robot can improve its capabilities by automatically synthesizing, extending, or adapting a model of its body. Second, we summarize the research area in which robots are used as tools to verify hypotheses about the mechanisms underlying biological body representations. We identify trends in these research areas and propose future research directions.


Robotics and Biomimetics | 2011

Information theoretic analysis on a soft robotic arm inspired by the octopus

Kohei Nakajima; Tao Li; Hidenobu Sumioka; Matteo Cianchetti; Rolf Pfeifer

Recent research in biomimetic robotics and embodied intelligence has revealed the importance of reciprocal, dynamical coupling between the brain (controller), the body, and the environment. A typical example is a soft robot with diverse compliant and elastic body dynamics: the coupling between the environment and the controller is expected to be enhanced by the softness of the robot's body. Accordingly, there is an increasing demand for a method to quantitatively and effectively characterize such coupling regimes. In this paper, we show that an information-theoretic approach can be used effectively for this purpose. Using a simple soft robotic platform inspired by the morphology and material properties of the octopus, we analyzed these characteristics and revealed that, because of its compliant and elastic body dynamics, our soft robotic arm was highly sensitive to environmental change.
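The information-theoretic characterization the abstract mentions can be illustrated with a minimal sketch (not the paper's actual data or estimator): a histogram-based mutual-information estimate between a motor command and a body sensor reading. The signals below are synthetic stand-ins; a strongly coupled "soft" sensor carries much more information about the actuation than an uncoupled one.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram estimate of mutual information I(X;Y) in bits."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal p(x)
    py = pxy.sum(axis=0, keepdims=True)   # marginal p(y)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

# Synthetic stand-ins for a motor command and a noisy bend sensor.
rng = np.random.default_rng(0)
motor = np.sin(np.linspace(0, 20 * np.pi, 5000))
sensor_soft = motor + 0.1 * rng.standard_normal(motor.size)   # strong coupling
sensor_rigid = rng.standard_normal(motor.size)                # no coupling

print(mutual_information(motor, sensor_soft))   # large: body reflects actuation
print(mutual_information(motor, sensor_rigid))  # near zero
```

Histogram estimators like this are biased upward for finite data, so in practice the "no coupling" value is small but not exactly zero; more careful estimators (or shuffle baselines) are typically used.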


Intelligent Robots and Systems | 2011

Computation with mechanically coupled springs for compliant robots

Hidenobu Sumioka; Helmut Hauser; Rolf Pfeifer

We introduce a simple model of the human musculoskeletal system to identify the computation that a compliant physical body can achieve. A one-joint system driven by the actuation of the springs around the joint is used as a computational device to compute temporal integrations and nonlinear combinations of an input signal. Only a linear, static readout unit is needed to extract the output of the computation. The results of computer simulations indicate that a network of mechanically coupled springs can emulate several nonlinear combinations that require temporal integration. A simulation with a two-joint system also shows that, thanks to the mechanical connection between the joints, a distant part of a compliant body can serve as a computational device driven by indirect input. Finally, the computational capability of antagonistic muscles and information transfer through mechanical couplings are discussed.
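The idea of a spring network as a computational device with a linear, static readout can be sketched in the style of physical reservoir computing. The following toy model is an assumption-laden stand-in for the paper's musculoskeletal system: a chain of masses coupled by nonlinear (stiffening) springs is driven at one end, and a ridge-regression readout over the mechanical states is trained toward a nonlinear target with memory. All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Chain of unit masses coupled by nonlinear springs; the first mass is
# driven by the input signal. Semi-implicit Euler integration.
n, T, dt = 5, 4000, 0.01
k1, k3, c = 10.0, 100.0, 1.0          # linear/cubic stiffness, damping
u = rng.uniform(-1, 1, T)             # random input "actuation"

x = np.zeros(n)
v = np.zeros(n)
states = np.empty((T, 2 * n))
for t in range(T):
    stretch = np.diff(np.concatenate(([0.0], x)))   # spring elongations
    f = -(k1 * stretch + k3 * stretch**3)           # nonlinear spring forces
    force = f - np.concatenate((f[1:], [0.0])) - c * v
    force[0] += 5.0 * u[t]                          # input drives first mass
    v += dt * force
    x += dt * v
    states[t] = np.concatenate((x, v))

# Target: a nonlinear combination of the input requiring memory.
y = u * np.roll(u, 10)
y[:10] = 0.0

# Linear, static readout trained by ridge regression on the body states.
X = np.hstack([states, np.ones((T, 1))])
w = np.linalg.solve(X.T @ X + 1e-6 * np.eye(X.shape[1]), X.T @ y)
nmse = np.mean((X @ w - y) ** 2) / np.var(y)
print(f"training NMSE: {nmse:.3f}")
```

The point of the construction is that all nonlinearity and memory live in the body dynamics; the readout itself is purely linear and static, matching the constraint described in the abstract.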


Artificial Life | 2013

Introduction to the special issue on morphological computation

Helmut Hauser; Hidenobu Sumioka; Rudolf Marcel Füchslin; Rolf Pfeifer

The purposeful action of any agent in a complex environment requires control, that is, the determination of specific sequences of values for the parameters determining the state of the agent. In recent years, it has become clear that we have to extend our notion of control if we want to understand the mechanical and chemical process management of biological systems. As it turns out, the lessons we learn from this extension can be directly used in engineering—foremost in the field of robotics, but increasingly also in other areas, such as artificial life or novel types of chemical systems design. The extension we refer to includes the intrinsic dynamics of the system to be controlled as an active, even computational, element of control. The employment of the physical or chemical dynamics of a system as part of the computations necessary for control is the underlying principle of the concept of morphological computation. In our use of the term "morphology" we include all aspects of a physical system: not only the shape of the agent and the geometric arrangement of its body parts, but also its material properties, such as friction coefficients or parameters describing compliance. Moreover, we also consider the distribution of sensors and actuators as part of the morphology. Conventionally (and with slight oversimplification), control is understood in terms of an agent (typically the term plant is used) and a separate entity, the control unit, which consists of a conventional digital computer. The state space of the agent is spanned by a number of parameters, which in the case of robots are mechanical in nature, but in the case of chemical cells can be densities, phase parameters, and so on. The controller possesses, via its sensors, an (at least partial) internal representation of the state of the agent at any time. In addition, the controller can change the state of the agent via some actuators.
In an ideal situation, the controller exerts complete control over the agent, which means it can determine the agent's position in state space with high accuracy. In such an ideal situation, the observable behavior is the result of a conventional computation based on the internal representation of the agent in the controller. However, there is always some noise present in the dynamics of the agent. Moreover, if the model of the agent gets too complex (high-dimensional state space, high nonlinearity, noise, etc.), the required controller may be intractable. To avoid both


Archive | 2013

Socially Developmental Robot based on Self-Induced Contingency with Multi Latencies

Hidenobu Sumioka; Yuichiro Yoshikawa; Masanori Morizono; Minoru Asada

Early social development is a process in which a human infant and his/her caregiver adapt to each other. This paper presents a learning mechanism to find the contingency of human-robot interaction in the real world, intended to enable a process similar to the mutual adaptation in infant-caregiver interactions. A contingency measure based on information theory is applied not only to acquire behavior rules but also to find a suitable latency at which to observe the found contingency. Experimental results show that a robot can acquire a series of social behaviors, such as gaze following and utterances toward a human subject, through 20 minutes of interaction. Mutual adaptation between them is discussed in terms of the transition and synchronization of their behavior, based on an analysis of the interaction data.
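Finding a suitable latency for a contingency measure, as the abstract describes, can be illustrated with a hedged toy example (not the paper's actual measure or data): mutual information between the robot's discrete action and the human's response is scanned across candidate latencies, and the contingency peaks at the lag with which the synthetic "human" actually responds.

```python
import numpy as np

def mutual_information(x, y, k):
    """MI in bits between two discrete sequences over k symbols."""
    p = np.zeros((k, k))
    for a, b in zip(x, y):
        p[a, b] += 1
    p /= p.sum()
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz])))

rng = np.random.default_rng(2)
T, lag_true = 5000, 7
robot = rng.integers(0, 3, T)            # robot's action at each step
human = np.roll(robot, lag_true)         # human mirrors it 7 steps later
human[:lag_true] = rng.integers(0, 3, lag_true)

# Scan candidate latencies; the contingency peaks at the true one.
scores = {lag: mutual_information(robot[:-lag], human[lag:], 3)
          for lag in range(1, 15)}
best = max(scores, key=scores.get)
print(best)   # 7
```

In a real interaction the response latency is variable, so a measure averaged over a latency window (rather than a single fixed lag) is more robust; the scan above just shows why the latency must be estimated at all.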


Robotics and Biomimetics | 2013

The effect of spine actuation and stiffness on a pneumatically-driven quadruped robot for cheetah-like locomotion

Qian Zhao; Benjamin Ellenberger; Hidenobu Sumioka; Timothy Sandy; Rolf Pfeifer

Biological research has concluded that actuation of the spine contributes significantly to the performance of quadrupeds, in terms of controlling body posture and integrating limb and trunk actions. Inspired by these biological findings, we developed a pneumatically-driven quadruped robot called Renny, with a configurable spine morphology, to study how the spine contributes to cheetah-like running. Three spine morphologies, a rigid spine, a passive spine, and an actuated spine, are introduced and tested on the Renny robot. In addition, we investigate the effect of the stiffness distribution of the spine muscles in the passive case. The experimental results show that the passive spine whose dorsal stiffness is higher than its ventral stiffness runs faster, even faster than the rigid case. Moreover, the coordination between the legs and the actuated spine is studied in the actuated morphology. We found that when the spinal movements are synchronized with the leg movements, the speed is much faster. In the actuated case, both flexion and extension increase the speed by advancing the limbs rapidly and increasing the limb swing.


Simulation of Adaptive Behavior | 2010

On the influence of sensor morphology on vergence

Harold Martinez; Hidenobu Sumioka; Max Lungarella; Rolf Pfeifer

In the field of developmental robotics, much attention has been devoted to algorithms that allow agents to build up skills through sensorimotor interaction. Such interaction is largely affected by the agent's morphology, that is, its shape and limb articulation, as well as the position and density of the sensors on its body surface. Despite its importance, the impact of morphology on behavior has not been systematically addressed. In this paper, we take inspiration from the human visual system and demonstrate, using a binocular active vision platform, why sensor morphology, in combination with other properties of the body, is an essential condition for achieving coordinated visual behavior (here, vergence). Specifically, to evaluate the effect of sensor morphology on behavior, we present an information-theoretic analysis quantifying the statistical regularities induced through sensorimotor interaction. Our results show that only for an adequate sensor morphology does vergence increase the amount of information structure in the sensorimotor loop.


Archive | 2018

Regulating Emotion with Body Ownership Transfer

Shuichi Nishio; Koichi Taura; Hidenobu Sumioka; Hiroshi Ishiguro

In this study, we experimentally examined whether changes in the facial expressions of teleoperated androids can affect and regulate their operators' emotions, based on the facial feedback theory of emotion and the phenomenon of body ownership transfer to the robot. Twenty-six Japanese participants conversed with an experimenter through a robot in a situation where the participants were induced to feel anger; during the conversation, the android's facial expression was changed according to a pre-programmed scheme. The results showed that facial feedback from the android did occur. Moreover, a comparison of two groups of participants, one of which operated the robot while the other did not, showed that this facial feedback occurred only when the participants operated the robot, and that when an operator could effectively operate the robot, his/her emotional state was more affected by the change in the robot's facial expression.


Archive | 2018

Body Ownership Transfer by Social Interaction

Shuichi Nishio; Koichi Taura; Hidenobu Sumioka; Hiroshi Ishiguro

Body ownership transfer (BOT) is the illusion of feeling external objects to be parts of our own body, which occurs when teleoperating android robots. In past studies, we investigated the conditions under which this illusion occurs. However, these studies were conducted using only simple operation tasks, such as moving only the robot's hand. Does this illusion occur during more complex tasks, such as conducting a conversation? What kind of conversation setting is required to invoke this illusion? In this study, we examined how factors in social interaction affect the occurrence of BOT. Participants conversed using the teleoperated robot under different conditions and teleoperation settings. The results revealed that BOT does occur during the task of conducting a conversation, and that the conversation partner's presence and appropriate responses are necessary to enhance BOT.


The Proceedings of JSME annual Conference on Robotics and Mechatronics (Robomec) | 2008

2A1-E21 Finding A Chain of Causality Leads Social Referencing

Hidenobu Sumioka; Yuichiro Yoshikawa; Minoru Asada

The development of joint-attention-related actions, such as gaze following and gaze alternation, is one of the mysteries of infant development. Previous synthetic studies have proposed learning methods for gaze following without any explicit instruction as a first step toward understanding such development. However, the robot was given a priori knowledge about which pairs of sensory information and action should be associated. In this paper, we propose a learning mechanism that automatically and iteratively acquires social behavior by detecting and reproducing the causality inherent in interaction with a caregiver, without such knowledge. A measure of causality based on transfer entropy [1] is used to detect appropriate pairs of variables for acquiring social actions, and the reproduction of the detected causality promotes further causality. In a computer simulation of human-robot interaction, we examine what kinds of joint-attention-related behavior can be acquired sequentially by changing the behavior of the caregiver agent. The results indicate that the actions are acquired in an order similar to infant development.
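As a minimal illustration of the causality measure named in the abstract (not the paper's implementation), transfer entropy for discrete sequences with history length 1 can be estimated from joint counts: it measures how much knowing x_t reduces uncertainty about y_{t+1} beyond what y_t already tells us. The demo data are synthetic.

```python
import numpy as np

def transfer_entropy(x, y, k=2):
    """Transfer entropy from X to Y (history length 1), in bits."""
    # Joint counts over (y_{t+1}, y_t, x_t).
    p = np.zeros((k, k, k))
    for t in range(len(y) - 1):
        p[y[t + 1], y[t], x[t]] += 1
    p /= p.sum()
    p_yx = p.sum(axis=0, keepdims=True)       # p(y_t, x_t)
    p_yy = p.sum(axis=2, keepdims=True)       # p(y_{t+1}, y_t)
    p_y = p.sum(axis=(0, 2), keepdims=True)   # p(y_t)
    ratio = p * p_y / (p_yx * p_yy + 1e-300)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(ratio[nz])))

rng = np.random.default_rng(3)
x = rng.integers(0, 2, 10000)
y = np.roll(x, 1)          # y copies x one step later
y[0] = 0

print(transfer_entropy(x, y))   # close to 1 bit: x drives y
print(transfer_entropy(y, x))   # close to 0 bits: no influence backwards
```

The asymmetry between the two directions is what makes transfer entropy usable as a causality detector, in contrast to symmetric measures such as mutual information.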

Collaboration


Dive into Hidenobu Sumioka's collaborations.
