Publication


Featured research published by Koichi Nishiwaki.


Intelligent Robots and Systems | 2007

GPU-accelerated real-time 3D tracking for humanoid locomotion and stair climbing

Philipp Michel; Joel E. Chestnutt; Satoshi Kagami; Koichi Nishiwaki; James J. Kuffner; Takeo Kanade

For humanoid robots to fully realize their biped potential in a three-dimensional world and step over, around or onto obstacles such as stairs, appropriate and efficient approaches to execution, planning and perception are required. To this end, we have accelerated a robust model-based three-dimensional tracking system by programmable graphics hardware to operate online at frame-rate during locomotion of a humanoid robot. The tracker recovers the full 6 degree-of-freedom pose of viewable objects relative to the robot. Leveraging the computational resources of the GPU for perception has enabled us to increase our tracker's robustness to the significant camera displacement and camera shake typically encountered during humanoid navigation. We have combined our approach with a footstep planner and a controller capable of adaptively adjusting the height of swing leg trajectories. The resulting integrated perception-planning-action system has allowed an HRP-2 humanoid robot to successfully and rapidly localize, approach and climb stairs, as well as to avoid obstacles during walking.


International Conference on Robotics and Automation | 2006

Online environment reconstruction for biped navigation

Philipp Michel; Joel E. Chestnutt; Satoshi Kagami; Koichi Nishiwaki; James J. Kuffner; Takeo Kanade

As navigation autonomy becomes an increasingly important research topic for biped humanoid robots, efficient approaches to perception and mapping that are suited to the unique characteristics of humanoids and their typical operating environments are required. This paper presents a system for online environment reconstruction that utilizes both external sensors for global localization, and on-body sensors for detailed local mapping. An external optical motion capture system is used to accurately localize on-board sensors that integrate successive 2D views of a calibrated camera and range measurements from a SwissRanger SR-2 time-of-flight sensor to construct global environment maps in real-time. Environment obstacle geometry is encoded in 2D occupancy grids and 2.5D height maps for navigation planning. We present an on-body implementation for the HRP-2 humanoid robot that, combined with a footstep planner, enables the robot to autonomously traverse dynamic environments containing unpredictably moving obstacles.
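The 2.5D height-map encoding described in this abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the grid size, resolution, max-Z fusion rule, and step-height threshold are all assumptions chosen for the sketch.

```python
import numpy as np

class HeightMap25D:
    """Minimal 2.5D height map: each XY cell stores the highest
    observed Z value (an illustrative stand-in for the encoding
    described in the abstract)."""

    def __init__(self, size_m=4.0, resolution_m=0.05):
        self.res = resolution_m
        n = int(round(size_m / resolution_m))
        self.heights = np.full((n, n), -np.inf)  # -inf marks unknown cells

    def insert_points(self, points_xyz):
        """Fuse a batch of 3D points (e.g. from a time-of-flight
        range sensor) into the grid, keeping the max Z per cell."""
        for x, y, z in points_xyz:
            i = int(round(x / self.res))
            j = int(round(y / self.res))
            if 0 <= i < self.heights.shape[0] and 0 <= j < self.heights.shape[1]:
                self.heights[i, j] = max(self.heights[i, j], z)

    def obstacle_mask(self, max_step_height=0.15):
        """Derive a 2D occupancy grid from the height map: known cells
        taller than the robot's assumed step height become obstacles."""
        known = np.isfinite(self.heights)
        return known & (self.heights > max_step_height)

# Example: a 30 cm box ahead of the robot becomes an obstacle region,
# while a 2 cm bump remains traversable.
hm = HeightMap25D()
hm.insert_points([(1.0, 1.0, 0.30), (1.0, 1.05, 0.30), (2.0, 2.0, 0.02)])
mask = hm.obstacle_mask()
```

Deriving the 2D occupancy grid from the 2.5D map, rather than maintaining two independent structures, keeps the two navigation representations consistent by construction.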


International Conference on Robotics and Automation | 2006

An intelligent joystick for biped control

Joel E. Chestnutt; Philipp Michel; Koichi Nishiwaki; James J. Kuffner; Satoshi Kagami

We present the concept of an intelligent joystick, an architecture which provides simple and intuitive high-level directional control of a legged robot while adjusting the actual foot placements autonomously to avoid stepping in undesirable places. The general concept can be likened to riding a horse: high-level commands are provided, while the intelligence of the underlying system selects proper foot placements with respect to the shape and properties of the underlying terrain and overall balance considerations. We demonstrate a prototype system used for real-time control of the humanoid robot HRP-2.
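The horse-riding analogy amounts to a filter between operator intent and foot placement, which can be sketched as below. The candidate-step set, the safety predicate, and the heading-deviation cost are illustrative assumptions, not the paper's actual selection rule.

```python
import math

def choose_footstep(command_heading, candidate_steps, is_safe):
    """Intelligent-joystick-style selection (illustrative sketch):
    follow the operator's commanded heading as closely as possible,
    but only among footsteps that the terrain check accepts.

    command_heading : desired walking direction, radians
    candidate_steps : list of (x, y, heading) foot placements
    is_safe         : predicate rejecting undesirable placements
    """
    safe = [s for s in candidate_steps if is_safe(s)]
    if not safe:
        return None  # no admissible step: stop rather than fall
    # Pick the safe step whose heading deviates least from the command
    # (atan2 of sin/cos wraps the angular difference into [-pi, pi]).
    return min(safe, key=lambda s: abs(math.atan2(
        math.sin(s[2] - command_heading),
        math.cos(s[2] - command_heading))))

# Example: the straight-ahead step lands on a hole, so the selector
# autonomously picks a safe alternative near the commanded heading.
holes = {(0.30, 0.0)}
steps = [(0.30, 0.0, 0.0), (0.28, 0.10, 0.2), (0.28, -0.10, -0.2)]
best = choose_footstep(0.0, steps, lambda s: (s[0], s[1]) not in holes)
```

The operator never sees the substitution: the commanded heading is honored approximately, while foot placement is delegated entirely to the safety check.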


International Conference on Robotics and Automation | 2003

Vision-based 2.5D terrain modeling for humanoid locomotion

Satoshi Kagami; Koichi Nishiwaki; James J. Kuffner; Kei Okada; Masayuki Inaba; Hirochika Inoue

We present an integrated humanoid locomotion and online terrain modeling system using stereo vision. From a 3D depth map, a 2.5D probabilistic description of the nearby terrain is generated. The depth map is calculated from a pair of stereo camera images, correlation-based localization is performed, and candidate planar walking surfaces are extracted. The results are used to update a probabilistic map of the terrain, which is input to an online footstep planning system. Experimental results are shown using the humanoid robot H7, which was designed as a research platform for intelligent humanoid robotics.
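The probabilistic terrain description mentioned in this abstract can be illustrated with a standard log-odds binary Bayes update over map cells. The sensor-model probabilities below are assumptions for the sketch, not the H7 system's parameters.

```python
import math

# Standard binary Bayes filter in log-odds form: each terrain cell
# accumulates evidence over repeated stereo observations.
L_OCC = math.log(0.7 / 0.3)    # assumed P(hit | obstacle) = 0.7
L_FREE = math.log(0.3 / 0.7)   # assumed P(hit | free)     = 0.3

def update_cell(logodds, observed_obstacle):
    """Fuse one observation into a cell's log-odds belief."""
    return logodds + (L_OCC if observed_obstacle else L_FREE)

def probability(logodds):
    """Convert log-odds back to an occupancy probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(logodds))

# Example: three consistent 'obstacle' readings push a cell's belief
# well above the 0.5 prior; one contradictory reading pulls it back.
belief = 0.0                    # log-odds 0 corresponds to P = 0.5
for obs in (True, True, True, False):
    belief = update_cell(belief, obs)
```

Working in log-odds turns each fusion step into a single addition, which is why this form is the usual choice when a map must be updated online from noisy stereo depth.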


The Proceedings of JSME annual Conference on Robotics and Mechatronics (Robomec) | 2008

2P1-G09 GPU-accelerated Real-Time 3D Tracking for Humanoid Autonomy

Philipp Michel; Joel E. Chestnutt; Satoshi Kagami; Koichi Nishiwaki; James J. Kuffner; Takeo Kanade

We have accelerated a robust model-based 3D tracking system by programmable graphics hardware to run online at frame-rate during operation of a humanoid robot and to efficiently auto-initialize. The tracker recovers the full 6 degree-of-freedom pose of viewable objects relative to the robot. Leveraging the computational resources of the GPU for perception has enabled us to increase our tracker’s robustness to the significant camera displacement and camera shake typically encountered during humanoid navigation. We have combined our approach with a footstep planner and a controller capable of adaptively adjusting the height of swing leg trajectories. The resulting integrated perception-planning-action system has allowed an HRP-2 humanoid robot to successfully and rapidly localize, approach and climb stairs, as well as to avoid obstacles during walking.


Archive | 2007

GPU-accelerated Real-Time 3D Tracking for Humanoid Autonomy

Philipp Michel; Joel E. Chestnutt; Satoshi Kagami; Koichi Nishiwaki; James J. Kuffner; Takeo Kanade


Journal of the Robotics Society of Japan | 2011

Robust Walking Control of Humanoids on Unknown Rough Terrain via Short Cycle Motion Generation Using Absolute Position Estimates

Koichi Nishiwaki; Satoshi Kagami


Journal of the Robotics Society of Japan | 2012

Online Walking Pattern Generation for Bipedal Robots that Considers Future Permissible Region of ZMP

Koichi Nishiwaki; Satoshi Kagami


The Proceedings of JSME annual Conference on Robotics and Mechatronics (Robomec) | 2011

2P2-J02 Online Walking Pattern Generation for a Humanoid that uses Estimated Actual Velocity of the Robot (Humanoid)

Koichi Nishiwaki; Satoshi Kagami


The Proceedings of JSME annual Conference on Robotics and Mechatronics (Robomec) | 2009

1A1-C20 A Method for Object Detection and Pose Estimation Using SIFT Feature Points of 3D Models, with GPU Acceleration

Takahiro Nakada; Michel Philipp; Joel E. Chestnutt; Yasuhide Fukushima; Yoshitaka Yamauchi; Koichi Nishiwaki; Satoshi Kagami; Takeo Kanade; Hiroshi Mizoguchi

Collaboration

Top co-authors of Koichi Nishiwaki:

Satoshi Kagami (Tokyo University of Science)
Joel E. Chestnutt (Carnegie Mellon University)
Philipp Michel (Carnegie Mellon University)
Hirochika Inoue (Japan Society for the Promotion of Science)
Takeo Kanade (Carnegie Mellon University)
Hiroshi Mizoguchi (Tokyo University of Science)