Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Hirohisa Hirukawa is active.

Publication


Featured research published by Hirohisa Hirukawa.


International Conference on Robotics and Automation | 2003

Biped walking pattern generation by using preview control of zero-moment point

Shuuji Kajita; Fumio Kanehiro; Kenji Kaneko; Kiyoshi Fujiwara; Kensuke Harada; Kazuhito Yokoi; Hirohisa Hirukawa

We introduce a new method of biped walking pattern generation that uses preview control of the zero-moment point (ZMP). First, the dynamics of a biped robot are modeled as a cart running on a table, which gives a convenient representation for treating the ZMP. After reviewing conventional methods of ZMP-based pattern generation, we formalize the problem as the design of a ZMP tracking servo controller. It is shown that such a controller can be realized by adopting preview control theory, which uses future reference values. It is also shown that a preview controller can compensate the ZMP error caused by the difference between a simple model and the precise multibody model. The effectiveness of the proposed method is demonstrated by a simulation of walking on spiral stairs.
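The cart-on-a-table relation underlying this abstract, p = x - (z_c / g) * x'', can be sketched in a few lines. The COM height, sample time and sway trajectory below are hypothetical values chosen only for illustration, not parameters from the paper:

```python
import math

G = 9.81        # gravity [m/s^2]
Z_C = 0.8       # assumed constant COM height [m] (hypothetical value)
DT = 0.005      # sample time [s]

def zmp_from_com(com, dt=DT, z_c=Z_C):
    """Cart-table model: p = x - (z_c / g) * x_ddot.
    COM acceleration is approximated by central differences."""
    zmp = []
    for i in range(1, len(com) - 1):
        x_ddot = (com[i - 1] - 2.0 * com[i] + com[i + 1]) / dt ** 2
        zmp.append(com[i] - (z_c / G) * x_ddot)
    return zmp

# Example: a slow sinusoidal COM sway; the resulting ZMP swings
# wider than the COM, which is why tracking it needs a servo.
t = [i * DT for i in range(2000)]
com = [0.05 * math.sin(2.0 * math.pi * 0.5 * ti) for ti in t]
zmp = zmp_from_com(com)
```

For a sinusoid of angular frequency w the ZMP amplitude is amplified by the factor 1 + z_c * w^2 / g, which the finite-difference computation reproduces closely.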


International Conference on Robotics and Automation | 2004

Humanoid robot HRP-2

Kenji Kaneko; Fumio Kanehiro; Shuuji Kajita; Hirohisa Hirukawa; Toshikazu Kawasaki; Masaru Hirata; Kazuhiko Akachi; Takakatsu Isozumi

In this paper, the development of the humanoid robot HRP-3 is presented. HRP-3, which stands for Humanoid Robotics Platform-3, is a human-size humanoid robot developed as the successor to HRP-2. One of the features of HRP-3 is that its main mechanical and structural components are designed to prevent the penetration of dust or spray. Another is that its wrist and hand are newly designed to improve manipulation. Software for operating a humanoid robot in a real environment is also improved. We describe the mechanical features of HRP-3 together with the newly developed hand, as well as the technologies implemented in the HRP-3 prototype. Electrical features and some experimental results using HRP-3 are also presented.


Intelligent Robots and Systems | 2001

The 3D linear inverted pendulum mode: a simple modeling for a biped walking pattern generation

Shuuji Kajita; Fumio Kanehiro; Kenji Kaneko; Kazuhito Yokoi; Hirohisa Hirukawa

For 3D walking control of a biped robot, we analyze the dynamics of a 3D inverted pendulum whose motion is constrained to an arbitrarily defined plane. This analysis yields a simple linear dynamics, the 3D linear inverted pendulum mode (3D-LIPM). The geometric nature of trajectories under the 3D-LIPM and a method for walking pattern generation are discussed. A simulation result of walking control using a 12-DOF biped robot model is also shown.
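Because the 3D-LIPM dynamics are linear, each horizontal axis has a closed-form solution about a fixed support point, and the orbital energy is conserved along it. The COM height and initial state below are assumed values for illustration, not the paper's parameters:

```python
import math

G = 9.81
Z_C = 0.8                      # assumed constant height of the COM plane [m]
T_C = math.sqrt(Z_C / G)       # time constant of the linear pendulum

def lipm_state(x0, v0, t, foot=0.0):
    """Closed-form 3D-LIPM solution for one axis about a fixed
    support point `foot`: x'' = (g / z_c) * (x - foot)."""
    c, s = math.cosh(t / T_C), math.sinh(t / T_C)
    x = (x0 - foot) * c + T_C * v0 * s + foot
    v = (x0 - foot) * s / T_C + v0 * c
    return x, v

def orbital_energy(x, v, foot=0.0):
    """E = v^2/2 - g/(2 z_c) * (x - foot)^2 is invariant under the
    3D-LIPM flow; this invariant is what makes step planning geometric."""
    return 0.5 * v * v - (G / (2.0 * Z_C)) * (x - foot) ** 2

# One single-support phase: start behind the foot, moving forward.
x0, v0 = -0.05, 0.2
x1, v1 = lipm_state(x0, v0, 0.4)
```

Checking that the orbital energy of (x1, v1) matches that of (x0, v0) is a quick sanity test of the closed form.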


Intelligent Robots and Systems | 2003

Resolved momentum control: humanoid motion planning based on the linear and angular momentum

Shuuji Kajita; Fumio Kanehiro; Kenji Kaneko; Kiyoshi Fujiwara; Kensuke Harada; Kazuhito Yokoi; Hirohisa Hirukawa

We introduce a method to generate whole-body motion of a humanoid robot such that the resulting total linear and angular momenta take specified values. First, we derive a linear equation that gives the total momentum of a robot in terms of its physical parameters, base-link velocity and joint velocities. Constraints between the legs and the environment are also considered. The whole-body motion is calculated from a given momentum reference by using a pseudo-inverse of the inertia matrix. As examples, we generated kicking and walking motions and tested them on the actual humanoid robot HRP-2. This method, resolved momentum control, gives us a unified framework to generate various maneuvers of humanoid robots.
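The pseudo-inverse step can be illustrated on a toy one-row system (two point masses on one axis), not the robot's full inertia matrix; the masses and reference momentum below are made up for the example:

```python
def pinv_row(a):
    """Moore-Penrose pseudo-inverse of a single-row matrix a (1 x n):
    a+ = a^T / (a a^T)."""
    n2 = sum(ai * ai for ai in a)
    return [ai / n2 for ai in a]

# Toy system: two point masses moving along one axis.
# Linear momentum P = m1*v1 + m2*v2 = A @ v, with A = [m1, m2].
m1, m2 = 3.0, 1.0
A = [m1, m2]

# The minimum-norm velocities realizing a reference momentum P_ref,
# the same least-squares role the inertia-matrix pseudo-inverse
# plays in resolved momentum control.
P_ref = 2.0
v = [a_plus * P_ref for a_plus in pinv_row(A)]
P = m1 * v[0] + m2 * v[1]
```

The pseudo-inverse distributes the reference momentum over the velocities with minimum norm while reproducing P_ref exactly, which is the property exploited for redundant humanoid joints.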


International Conference on Robotics and Automation | 2002

A realtime pattern generator for biped walking

Shuuji Kajita; Fumio Kanehiro; Kenji Kaneko; Kiyoshi Fujiwara; Kazuhito Yokoi; Hirohisa Hirukawa

For real-time walking control of a biped robot, we analyze the dynamics of a three-dimensional inverted pendulum whose motion is constrained to an arbitrarily defined plane. This analysis leads to a simple linear dynamics, the Three-Dimensional Linear Inverted Pendulum Mode (3D-LIPM). The geometric nature of trajectories under the 3D-LIPM is discussed, and an algorithm for walking pattern generation is presented. Experimental results of real-time walking control of the 12-DOF biped robot HRP-2L using an input device such as a game pad are also shown.


Intelligent Robots and Systems | 2006

Biped Walking Pattern Generator allowing Auxiliary ZMP Control

Shuuji Kajita; Mitsuharu Morisawa; Kensuke Harada; Kenji Kaneko; Fumio Kanehiro; Kiyoshi Fujiwara; Hirohisa Hirukawa

A biped walking pattern generator that allows an additional ZMP control input (the auxiliary ZMP) is presented. The auxiliary ZMP is realized by an inverse system added to a pattern generator based on ZMP preview control. To compensate for the effect of the auxiliary ZMP, we apply virtual time shifting of the reference ZMP. As an application of the proposed method, walking control on uneven terrain is simulated. The simulated robot can walk successfully by changing its walking speed as a side effect of the auxiliary ZMP control.


IEEE-RAS International Conference on Humanoid Robots | 2004

An analytical method on real-time gait planning for a humanoid robot

Kensuke Harada; Shuuji Kajita; Kenji Kaneko; Hirohisa Hirukawa

This paper studies real-time gait planning for a humanoid robot. By simultaneously planning the trajectories of the COG (center of gravity) and the ZMP (zero-moment point), a fast and smooth change of gait can be realized. The change of gait is achieved by connecting the newly calculated trajectories to the current ones. We propose two methods for connecting two trajectories, a real-time method and a quasi-real-time one, and show that a stable change of gait can be realized by the quasi-real-time method even if the change of step position is significant. The effectiveness of the proposed methods is confirmed by simulation and experiment.


Intelligent Robots and Systems | 2004

Robust speech interface based on audio and video information fusion for humanoid HRP-2

Isao Hara; Futoshi Asano; Hideki Asoh; Jun Ogata; Naoyuki Ichimura; Yoshihiro Kawai; Fumio Kanehiro; Hirohisa Hirukawa; Kiyoshi Yamamoto

For cooperative work between robots and humans in the real world, a communicative function based on speech is indispensable for robots. To realize such a function in a noisy real environment, it is essential that robots be able to extract target speech spoken by humans from a mixture of sounds using their own resources. We have developed a method of detecting and extracting speech events based on the fusion of audio and video information. In this method, audio information (sound localization using a microphone array) and video information (human tracking using a camera) are fused by a Bayesian network to enable the detection of speech events. The information on detected speech events is then used for sound separation by adaptive beamforming. In this paper, some basic investigations for applying the above system to the humanoid robot HRP-2 are reported. The input devices, a microphone array and a camera, were mounted on the head of HRP-2, and the acoustic characteristics relevant to sound localization and separation performance were investigated. The human tracking system was also improved so that it can be used in dynamic situations. Finally, the overall performance of the system was tested in off-line experiments.
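The fusion idea can be sketched as a naive-Bayes combination of the two cues, i.e. assuming the audio and video observations are conditionally independent given a speech event. The prior and likelihoods below are hypothetical numbers for illustration; the actual system uses a richer Bayesian network:

```python
def fuse(p_speech, audio_lik, video_lik):
    """Naive-Bayes fusion of two detectors.
    audio_lik / video_lik are pairs:
    (P(observation | speech), P(observation | no speech))."""
    num = p_speech * audio_lik[0] * video_lik[0]
    den = num + (1.0 - p_speech) * audio_lik[1] * video_lik[1]
    return num / den   # posterior P(speech | audio, video)

# Hypothetical detector characteristics: each cue alone is ambiguous,
# but combined they give a confident speech-event decision.
prior = 0.3
audio = (0.8, 0.3)   # sound localized near the person
video = (0.7, 0.2)   # face tracked with lip motion
p = fuse(prior, audio, video)
```

With these numbers the posterior rises well above the prior, which is the qualitative behavior the audio-video fusion is after: neither cue decides alone, their agreement does.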


The International Journal of Robotics Research | 2007

Learning from Observation Paradigm: Leg Task Models for Enabling a Biped Humanoid Robot to Imitate Human Dances

Shin’ichiro Nakaoka; Atsushi Nakazawa; Fumio Kanehiro; Kenji Kaneko; Mitsuharu Morisawa; Hirohisa Hirukawa; Katsushi Ikeuchi

This paper proposes a framework that realizes the Learning from Observation paradigm for dance motions. The framework enables a humanoid robot to imitate dance motions captured from human demonstrations. This study focuses especially on leg motions, a novel attempt in which a biped robot imitates not only upper-body motions but also leg motions, including steps. Body differences between the robot and the original dancer make the problem difficult, because they prevent the robot from following the original motions directly and also change the dynamic body balance. We propose leg task models, which play a key role in solving the problem. Low-level tasks in leg motion are modelled so that they clearly provide the essential information required for keeping dynamic stability, as well as the important motion characteristics. The models divide the problem of adapting motions into recognizing a sequence of tasks and executing that task sequence. We have developed a method for recognizing the tasks from captured motion data and a method for generating motions of the tasks that can be executed by existing robots, including HRP-2. HRP-2 successfully performed the generated motions, imitating a traditional folk dance performed by human dancers.


International Conference on Robotics and Automation | 2003

Cooperative works by a human and a humanoid robot

Kazuhiko Yokoyama; Hiroyuki Handa; Takakatsu Isozumi; Yutaro Fukase; Kenji Kaneko; Fumio Kanehiro; Yoshihiro Kawai; Fumiaki Tomita; Hirohisa Hirukawa

We have developed the humanoid robot HRP-2P with a biped locomotion controller, stereo vision software and an aural human interface to realize cooperative work between a human and a humanoid robot. The robot can find a target object by vision and carry it cooperatively with a human by biped locomotion, following the human's voice commands. A cooperative control is applied to the arms of the robot while it carries the object, and the walking direction of the robot is controlled by the interaction force and torque measured by the force/torque sensors on the wrists. Experimental results are presented in the paper.

Collaboration


Dive into Hirohisa Hirukawa's collaborations.

Top Co-Authors

Shuuji Kajita, National Institute of Advanced Industrial Science and Technology
Fumio Kanehiro, National Institute of Advanced Industrial Science and Technology
Kenji Kaneko, National Institute of Advanced Industrial Science and Technology
Kiyoshi Fujiwara, National Institute of Advanced Industrial Science and Technology
Kazuhito Yokoi, National Institute of Advanced Industrial Science and Technology
Mitsuharu Morisawa, National Institute of Advanced Industrial Science and Technology
Shin'ichiro Nakaoka, National Institute of Advanced Industrial Science and Technology
Takakatsu Isozumi, National Institute of Advanced Industrial Science and Technology
Hajime Saito, National Institute of Advanced Industrial Science and Technology