Atsushi Haneda
University of Tokyo
Publications
Featured research published by Atsushi Haneda.
international conference on mechatronics and automation | 2005
Kei Okada; Takashi Ogura; Atsushi Haneda; Junya Fujimoto; Fabien Gravot; Masayuki Inaba
This paper describes the software system of a humanoid robot from a motion-generation perspective, taking kitchen-helping behaviors as an example of a real-world task. The software consists of high-level reasoning modules, including a 3D-geometric-model-based action/motion planner, and runtime modules containing a 3D visual processor, a force manipulation controller, and a walking controller. We discuss how motion-generation functions based on the high-level motion and action planner contribute to various real-world humanoid tasks, and the role of perception-based motion generation using a vision sensor and force sensors.
intelligent robots and systems | 2004
Kei Okada; Atsushi Haneda; Hiroyuki Nakai; Masayuki Inaba; Hirochika Inoue
In this paper, we describe a planner for a humanoid robot that can find a path in an environment with movable objects, whereas previous motion planners dealt only with environments of fixed objects. We address an environment manipulation problem in which a humanoid robot finds a walking path from a given start location to a goal location while displacing objects that obstruct the path. This problem requires a more complex configuration space than previous research using mobile robots, especially in the manipulation phase, since a humanoid arm has many more degrees of freedom than a forklift-type robot. Our approach is to build an environment manipulation task graph that decomposes the given task into subtasks, each solved by a navigation path planner or a whole-body motion planner. We also propose a standing-location search and a displaced-obstacle-location search for connecting subtasks, and an efficient manipulation planning method that relies on whole-body inverse kinematics and motion planning. Finally, we show experimental results in an environment with movable objects such as chairs and trash boxes: the planner finds an action sequence consisting of walking paths and manipulations of obstructing objects that takes the robot from the start position to the goal position.
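The task-graph decomposition described in this abstract can be illustrated with a minimal sketch. All names here (`build_task_graph`, the subtask labels, the waypoint/obstacle representation) are hypothetical simplifications, not the paper's actual data structures: a walking path is decomposed into walk subtasks, and a displace subtask (preceded by a move to a standing location) is inserted wherever a movable object blocks the next waypoint.

```python
# Hypothetical sketch of the task-graph idea: decompose a navigation
# query into subtasks, inserting a "displace obstacle" manipulation
# subtask before each movable object that blocks the walking path.
def build_task_graph(path, movable_obstacles):
    """path: ordered waypoints; movable_obstacles: dict waypoint -> object name."""
    subtasks = []
    for a, b in zip(path, path[1:]):
        obstacle = movable_obstacles.get(b)
        if obstacle is not None:
            # Manipulation phase: stand near the obstacle, then displace it.
            subtasks.append(("walk_to_standing_location", a, b))
            subtasks.append(("displace", obstacle))
        # Navigation phase: walk the now-clear segment.
        subtasks.append(("walk", a, b))
    return subtasks
```

In the paper, each subtask would then be handed to the navigation path planner or the whole-body motion planner; here the graph is just the ordered subtask list.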
international conference on robotics and automation | 2004
Kei Okada; Takashi Ogura; Atsushi Haneda; Daisuke Kousaka; Hiroyuki Nakai; Masayuki Inaba; Hirochika Inoue
This paper describes the design and development of system software for humanoid robots, so that researchers who specialize not only in biped walking but also in other fields can use humanoid robots as a research tool. For this purpose, the system must integrate and organize subsystems such as control, recognition, dialogue, and planning, and it must provide efficient full-body motion control by specifying fewer degrees of freedom than the full joint set. Our design provides a common interface among subsystems by implementing each function as a method call through a three-dimensional model of the robot, and it also provides motion-planning-based full-body posture sequencing and walking pattern generation. Finally, we show integrated behavior experiments with vision, planning, and motion control using the developed system software on a life-sized humanoid robot, HRP2.
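The shared-model interface idea can be sketched as follows. This is an illustrative assumption, not the paper's actual API: every subsystem reads from and writes to one 3D robot model object via method calls, so vision results and planned postures are exchanged as model state rather than subsystem-specific messages.

```python
# Hypothetical sketch of a shared 3D robot model acting as the common
# interface between vision, planning, and motion-control subsystems.
class RobotModel3D:
    def __init__(self, joints):
        self.angles = {j: 0.0 for j in joints}  # current joint posture
        self.objects = {}                       # recognized objects, by name

    # The vision subsystem writes recognition results into the model.
    def register_object(self, name, pose):
        self.objects[name] = pose

    # The planner reads the model and writes a target posture back,
    # possibly specifying only a few degrees of freedom.
    def set_posture(self, angles):
        self.angles.update(angles)

    # The motion controller reads the posture the planner left behind.
    def get_posture(self):
        return dict(self.angles)
```

The point of the design is that adding a new subsystem only requires new methods on the model, not a new inter-process protocol.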
international conference on robotics and automation | 2005
Kei Okada; Takashi Ogura; Atsushi Haneda; Masayuki Inaba
This paper describes a vision-based 3D walking system for a humanoid robot that combines a precise 3D planar surface detection method with a practical 3D footstep planner. The walking control system requires a vision system with 10 mm accuracy, so we developed a precise 3D planar surface recognition system that combines the 3D Hough transformation with robust estimation. We also developed a practical 3D footstep planner that takes into account the kinematic and dynamic restrictions of the robot hardware. Finally, we present vision-based 3D walking experiments in which a humanoid robot steps onto an unknown obstacle.
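The 3D Hough transformation for plane detection can be sketched in a few lines. This is a generic textbook-style voting scheme, not the paper's implementation (which adds robust estimation on top): each candidate plane is parameterized by a unit normal n and a signed distance d with n·p = d, and every point votes for the quantized (n, d) cells it lies on.

```python
import numpy as np

def hough_planes(points, n_theta=18, n_phi=36, d_step=0.01, min_votes=50):
    """Detect planar surfaces in an (N, 3) point cloud by voting in
    (theta, phi, d) Hough space, with normal
    n = (sin t cos p, sin t sin p, cos t) and plane equation n . x = d."""
    thetas = np.linspace(0.0, np.pi, n_theta)
    phis = np.linspace(0.0, 2.0 * np.pi, n_phi, endpoint=False)
    # Candidate unit normals on a coarse sphere grid.
    normals = [np.array([np.sin(t) * np.cos(p), np.sin(t) * np.sin(p), np.cos(t)])
               for t in thetas for p in phis]
    votes = {}
    for n in normals:
        # Signed distance of every point along this normal, quantized to cells.
        cells, counts = np.unique(
            np.round(points @ n / d_step).astype(int), return_counts=True)
        for c, k in zip(cells, counts):
            key = (tuple(np.round(n, 3)), int(c))
            votes[key] = votes.get(key, 0) + int(k)
    # Cells with enough support become plane hypotheses (normal, distance).
    return [(np.array(nrm), c * d_step)
            for (nrm, c), k in votes.items() if k >= min_votes]
```

A robust estimator (as in the paper) would then refit each hypothesis to its inlier points to reach the required accuracy; the coarse Hough vote alone only localizes planes to the grid resolution.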
international conference on robotics and automation | 2006
Fabien Gravot; Atsushi Haneda; Kei Okada; Masayuki Inaba
This paper presents work toward the old dream of the housekeeping robot: a humanoid robot cooperates with the user to cook simple dishes. The system combines predefined tasks and dialogues to find a plan in which robot and user help each other in the kitchen. The kitchen problem allows the demonstration of a large variety of actions, and hence creates the need to find and plan those actions; with this problem, the task planner can be fully exercised to enhance the robot's reasoning capacity. Furthermore, the robot must also use motion planning to obtain general procedures for carrying out the planned actions. We focus on the planning problems and on the interaction between these two planning methods.
international conference on mechatronics and automation | 2008
Atsushi Haneda; Kei Okada; Masayuki Inaba
This paper presents a new robust realtime manipulation planning system that lets a robot manipulate movable objects under an interactive dynamics simulator. Previous manipulation planning systems were not robust in dynamically changing environments. To solve this problem, we propose an integrated manipulation planning system composed of a symbolic task planner, a geometric motion planner, and an interactive dynamics simulator. The iterative realtime planning process combines global motion planning over a symbolic representation, local motion planning that generates the robot's geometric actions, and an interactive dynamics simulator that switches between kinematic and dynamic simulation modes. In experiments, we demonstrate that our realtime manipulation planning system can solve manipulation problems with several objects efficiently and robustly in a dynamic environment.
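The iterative replanning loop that such a three-layer system implies can be sketched as below. The interfaces (`plan`, `solve`, `observe`, `execute`) are hypothetical placeholders, not the paper's API: the symbolic planner produces a global action sequence, the motion planner turns each action into geometry, the simulator validates execution, and any failure triggers a fresh observation and a new symbolic plan.

```python
# Hypothetical sketch of the iterative realtime planning loop: global
# symbolic planning, local geometric planning, and simulation-based
# execution, with replanning whenever the environment changes underfoot.
def plan_and_execute(task_planner, motion_planner, simulator, goal, max_iters=20):
    for _ in range(max_iters):
        state = simulator.observe()                # current (possibly changed) world
        actions = task_planner.plan(state, goal)   # global symbolic plan
        if not actions:
            return True                            # goal already satisfied
        for action in actions:
            traj = motion_planner.solve(state, action)  # local geometric motion
            if traj is None or not simulator.execute(traj):
                break                              # failure: replan from scratch
            state = simulator.observe()
    return False                                   # gave up after max_iters
```

Bounding the loop with `max_iters` is one simple way to keep the planner realtime: a changed world costs one extra iteration rather than an unbounded search.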
ieee-ras international conference on humanoid robots | 2005
Takashi Ogura; Atsushi Haneda; Kei Okada; Masayuki Inaba
To realize humanoid robots that support people in their daily lives, we need an on-site humanoid behavior navigation system in which a robot behaves autonomously while also reacting to the intentions of a human. This paper proposes a new on-site navigation interface for humanoid robots, in which a human guides the robot hand-in-hand. Previous humanoid research has aimed at autonomous intelligence able to survive in a real environment without any help from a person. To use a humanoid robot in the real world, however, it is necessary to take the robot to various locations and have it manipulate various objects and tools, which calls for on-site navigation in which a person can induce humanoid behavior. In the experiment, the operator takes a humanoid robot to a kitchen and teaches it to hold a kettle using this interaction kernel, and the recorded motion sequence is then played back by planning in different environments.
The Proceedings of JSME annual Conference on Robotics and Mechatronics (Robomec) | 2008
Atsushi Haneda; Kei Okada; Masayuki Inaba
The Proceedings of JSME annual Conference on Robotics and Mechatronics (Robomec) | 2005
Atsushi Haneda; Kei Okada; Masayuki Inaba