Koichi Nishiwaki
Carnegie Mellon University
Publications
Featured research published by Koichi Nishiwaki.
Intelligent Robots and Systems | 2007
Philipp Michel; Joel E. Chestnutt; Satoshi Kagami; Koichi Nishiwaki; James J. Kuffner; Takeo Kanade
For humanoid robots to fully realize their biped potential in a three-dimensional world and step over, around or onto obstacles such as stairs, appropriate and efficient approaches to execution, planning and perception are required. To this end, we have accelerated a robust model-based three-dimensional tracking system by programmable graphics hardware to operate online at frame-rate during locomotion of a humanoid robot. The tracker recovers the full 6 degree-of-freedom pose of viewable objects relative to the robot. Leveraging the computational resources of the GPU for perception has enabled us to increase our tracker's robustness to the significant camera displacement and camera shake typically encountered during humanoid navigation. We have combined our approach with a footstep planner and a controller capable of adaptively adjusting the height of swing leg trajectories. The resulting integrated perception-planning-action system has allowed an HRP-2 humanoid robot to successfully and rapidly localize, approach and climb stairs, as well as to avoid obstacles during walking.
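One way to picture the adaptive swing-leg height adjustment mentioned in the abstract (a minimal hypothetical sketch with invented function names, not the authors' actual controller): the trajectory apex is raised just enough to clear the tallest terrain point between footholds, and the vertical profile blends the foothold heights with a parabolic lift.

```python
def swing_apex(terrain_heights, start_z, end_z, clearance=0.03):
    """Raise the swing-foot apex just enough to clear the tallest terrain
    point between footholds, plus a safety clearance (metres)."""
    tallest = max(terrain_heights + [start_z, end_z])
    return tallest + clearance

def swing_z(s, start_z, end_z, apex):
    """Vertical foot height over normalised phase s in [0, 1]: a linear
    blend of the foothold heights plus a parabolic lift to the apex."""
    base = (1 - s) * start_z + s * end_z
    lift = 4 * s * (1 - s) * (apex - max(start_z, end_z))
    return base + lift
```

With flat footholds at z = 0 and obstacles up to 0.15 m, `swing_apex` returns 0.18 m and the profile peaks there at mid-swing.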
International Conference on Robotics and Automation | 2006
Philipp Michel; Joel E. Chestnutt; Satoshi Kagami; Koichi Nishiwaki; James J. Kuffner; Takeo Kanade
As navigation autonomy becomes an increasingly important research topic for biped humanoid robots, efficient approaches to perception and mapping that are suited to the unique characteristics of humanoids and their typical operating environments are required. This paper presents a system for online environment reconstruction that utilizes both external sensors for global localization, and on-body sensors for detailed local mapping. An external optical motion capture system is used to accurately localize on-board sensors that integrate successive 2D views of a calibrated camera and range measurements from a SwissRanger SR-2 time-of-flight sensor to construct global environment maps in real-time. Environment obstacle geometry is encoded in 2D occupancy grids and 2.5D height maps for navigation planning. We present an on-body implementation for the HRP-2 humanoid robot that, combined with a footstep planner, enables the robot to autonomously traverse dynamic environments containing unpredictably moving obstacles.
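The 2.5D height-map and 2D occupancy-grid encoding described above can be sketched minimally as follows (an illustrative assumption, not the paper's implementation): each range measurement is binned into a horizontal grid cell that keeps the maximum observed height, and an occupancy grid is derived by thresholding those heights against the robot's step limit.

```python
import numpy as np

class HeightMap:
    """Minimal 2.5D height map: one max-height value per horizontal grid cell."""

    def __init__(self, size=100, resolution=0.05):
        self.resolution = resolution               # metres per cell
        self.heights = np.full((size, size), -np.inf)

    def update(self, points):
        """Fold a batch of (x, y, z) range measurements into the map."""
        for x, y, z in points:
            i = int(x / self.resolution)
            j = int(y / self.resolution)
            self.heights[i, j] = max(self.heights[i, j], z)

    def occupancy(self, max_step_height=0.10):
        """2D occupancy grid: cells taller than the step limit are obstacles."""
        return self.heights > max_step_height
```

In a real system the map would be re-centred on the robot and fused with localization; here the grid is fixed for brevity.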
International Conference on Robotics and Automation | 2006
Joel E. Chestnutt; Philipp Michel; Koichi Nishiwaki; James J. Kuffner; Satoshi Kagami
We present the concept of an intelligent joystick, an architecture which provides simple and intuitive high-level directional control of a legged robot while adjusting the actual foot placements autonomously to avoid stepping in undesirable places. The general concept can be likened to riding a horse: high-level commands are provided, while the intelligence of the underlying system selects proper foot placements with respect to the shape and properties of the underlying terrain and overall balance considerations. We demonstrate a prototype system used for real-time control of the humanoid robot HRP-2.
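A toy version of the intelligent-joystick idea (hypothetical names throughout, assuming a precomputed set of candidate foot placements and a safety predicate): the user supplies only a desired heading, and the system autonomously picks the safe candidate step that best matches it.

```python
import math

def choose_step(heading, candidates, is_safe):
    """Pick the candidate foot placement (dx, dy) closest in direction to
    the commanded heading (radians), skipping unsafe footholds."""
    best, best_err = None, math.inf
    for dx, dy in candidates:
        if not is_safe(dx, dy):
            continue                        # autonomy: reject bad footholds
        err = abs(math.atan2(dy, dx) - heading)
        err = min(err, 2 * math.pi - err)   # wrap the angular difference
        if err < best_err:
            best, best_err = (dx, dy), err
    return best
```

If the straight-ahead step lands somewhere undesirable, the next-best-aligned safe step is taken instead, which is the horse-riding behaviour the abstract describes.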
International Conference on Robotics and Automation | 2003
Satoshi Kagami; Koichi Nishiwaki; James J. Kuffner; Kei Okada; Masayuki Inaba; Hirochika Inoue
We present an integrated humanoid locomotion and online terrain modeling system using stereo vision. From a 3D depth map, a 2.5D probabilistic description of the nearby terrain is generated. The depth map is calculated from a pair of stereo camera images, correlation-based localization is performed, and candidate planar walking surfaces are extracted. The results are used to update a probabilistic map of the terrain, which is input to an online footstep planning system. Experimental results are shown using the humanoid robot H7, which was designed as a research platform for intelligent humanoid robotics.
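The probabilistic terrain update could be sketched with a standard log-odds cell update (an illustrative assumption; the paper's actual sensor model may differ): each stereo observation votes a cell toward "planar/walkable" or not, and repeated agreement drives the cell's probability toward certainty.

```python
import math

def logit(p):
    """Log-odds of a probability."""
    return math.log(p / (1 - p))

def update_cell(log_odds, observed_planar, p_hit=0.7):
    """Bayesian log-odds update of one terrain cell from one observation."""
    return log_odds + (logit(p_hit) if observed_planar else logit(1 - p_hit))

def probability(log_odds):
    """Convert log-odds back to a probability."""
    return 1 / (1 + math.exp(-log_odds))
```

Working in log-odds makes repeated updates a simple addition and avoids probabilities saturating at exactly 0 or 1.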
The Proceedings of JSME annual Conference on Robotics and Mechatronics (Robomec) | 2008
Philipp Michel; Joel E. Chestnutt; Satoshi Kagami; Koichi Nishiwaki; James J. Kuffner; Takeo Kanade
We have accelerated a robust model-based 3D tracking system by programmable graphics hardware to run online at frame-rate during operation of a humanoid robot and to efficiently auto-initialize. The tracker recovers the full 6 degree-of-freedom pose of viewable objects relative to the robot. Leveraging the computational resources of the GPU for perception has enabled us to increase our tracker’s robustness to the significant camera displacement and camera shake typically encountered during humanoid navigation. We have combined our approach with a footstep planner and a controller capable of adaptively adjusting the height of swing leg trajectories. The resulting integrated perception-planning-action system has allowed an HRP-2 humanoid robot to successfully and rapidly localize, approach and climb stairs, as well as to avoid obstacles during walking.
Archive | 2007
Philipp Michel; Joel E. Chestnutt; Satoshi Kagami; Koichi Nishiwaki; James J. Kuffner; Takeo Kanade
Journal of the Robotics Society of Japan | 2011
Koichi Nishiwaki; Satoshi Kagami
Journal of the Robotics Society of Japan | 2012
Koichi Nishiwaki; Satoshi Kagami
The Proceedings of JSME annual Conference on Robotics and Mechatronics (Robomec) | 2011
Koichi Nishiwaki; Satoshi Kagami
The Proceedings of JSME annual Conference on Robotics and Mechatronics (Robomec) | 2009
Takahiro Nakada; Philipp Michel; Joel E. Chestnutt; Yasuhide Fukushima; Yoshitaka Yamauchi; Koichi Nishiwaki; Satoshi Kagami; Takeo Kanade; Hiroshi Mizoguchi