Publications

Featured research published by Mitsuharu Kojima.


Proceedings of the IEEE | 2012

Home-Assistant Robot for an Aging Society

Kimitoshi Yamazaki; Ryohei Ueda; Shunichi Nozawa; Mitsuharu Kojima; Kei Okada; Kiyoshi Matsumoto; Masaru Ishikawa; Isao Shimoyama; Masayuki Inaba

Many countries around the world face three major issues associated with their aging societies: a declining population, an increasing proportion of seniors, and an increasing number of single-person households. To explore assistive technologies that can help solve the problems faced by aging societies, we have tested several information and robot technologies. This paper introduces research on a home-assistant robot, which improves the ease and productivity of home activities. For people who work hard outside the home, the assistant robot performs chores in their home environment while they are away. A case study of a life-sized robot with a humanlike functional body performing daily chores is presented. An integrated software system incorporating modeling, recognition, and manipulation skills, as well as a motion generation approach based on the software system, is explained. Moreover, because housekeepers perform chores one after another in their daily environment, we also aim to develop a system for continuously performing a series of tasks by including failure detection and recovery.


IEEE-RAS International Conference on Humanoid Robots | 2006

Vision based behavior verification system of humanoid robot for daily environment tasks

Kei Okada; Mitsuharu Kojima; Yuuichi Sagawa; Toshiyuki Ichino; Kenji Sato; Masayuki Inaba

This paper describes an integrated, intelligent humanoid robot system for daily-life environment tasks. We have realized complex behaviors of a humanoid robot in a daily-life environment based on a motion planning technique that uses environment and manipulation knowledge. However, in order to adapt to unknown or dynamic situations, sensor-based behavior verification is essential. In this paper, we present the design and implementation of a sensor-based behavior verification system built on the same environment and manipulation knowledge used by the manipulation motion planner. We also present a software architecture that allows us to write a single stream of code to perform complex, concurrent humanoid motions. Using this architecture, sensor-based verification functions are easily integrated into motion generation functions. Finally, we demonstrate a water-pouring task and a dishwashing task performed by the life-sized humanoid robot HRP2-JSK in a real environment while it verifies its own motion.


IEEE/RSJ International Conference on Intelligent Robots and Systems | 2007

Multi-cue 3D object recognition in knowledge-based vision-guided humanoid robot system

Kei Okada; Mitsuharu Kojima; Satoru Tokutsu; Toshiaki Maki; Yuto Mori; Masayuki Inaba

A vision-based object recognition subsystem for a knowledge-based humanoid robot system is presented. A humanoid robot system for real-world service applications must integrate an object recognition subsystem and a motion planning subsystem for both mobility and manipulation tasks. These requirements call for a vision system capable of self-localization for navigation tasks and object recognition for manipulation tasks, while communicating with the motion planning subsystem. In this paper, we describe the design and implementation of a knowledge-based visual 3D object recognition system with multi-cue integration using a particle filter technique. The particle filter provides very robust object recognition performance, and the knowledge-based approach enables the robot to perform both object localization and self-localization using movable/fixed information. Since this object recognition subsystem shares knowledge with the motion planning subsystem, we are able to generate vision-guided humanoid behaviors without considering visual processing functions separately. Finally, to demonstrate the generality of the system, we present several vision-based humanoid behavior experiments in a daily-life environment.


IEEE/RSJ International Conference on Intelligent Robots and Systems | 2008

Task guided attention control and visual verification in tea serving by the daily assistive humanoid HRP2JSK

Kei Okada; Mitsuharu Kojima; Satoru Tokutsu; Yuto Mori; Toshiaki Maki; Masayuki Inaba

This paper describes daily assistive task experiments conducted on the HRP2JSK humanoid robot. We present an overall integrated action and recognition system design for realizing daily assistive behaviors autonomously and robustly, along with a demonstration in which HRP2JSK pours tea from a bottle into a cup and washes the cup after a human has drunk from it. To achieve autonomy and robustness, visual recognition and behavior control driven by perception information are essential.


IEEE/RSJ International Conference on Intelligent Robots and Systems | 2008

Wheelchair support by a humanoid through integrating environment recognition, whole-body control and human-interface behind the user

Shunichi Nozawa; Toshiaki Maki; Mitsuharu Kojima; Shigeru Kanzaki; Kei Okada; Masayuki Inaba

In this paper, we address wheelchair support by a life-sized humanoid robot. Achieving this task requires integrating whole-body motion, environment recognition, and a human interface behind the user. The contributions of this paper are: whole-body control, including a pushing motion that uses the offset of the ZMP and observation of attitude outliers; recognition of the wheelchair using a particle filter; and a human interface behind the user based on face detection and gesture recognition.


IEEE/RSJ International Conference on Intelligent Robots and Systems | 2008

Manipulation and recognition of objects incorporating joints by a humanoid robot for daily assistive tasks

Mitsuharu Kojima; Kei Okada; Masayuki Inaba

Methods for a daily assistive humanoid robot to manipulate and recognize objects incorporating joints, and to learn the associated manipulation knowledge, are presented. To provide daily assistance, humanoid robots must be able to use objects incorporating joints, such as certain furniture and tools. We have been building an integrated humanoid recognition and manipulation system for objects and tools in the real world, and here we extend that system to objects incorporating joints. In this paper, we present a recognition system in which the robot recognizes objects incorporating joints using a visual 3D object recognition method with multi-cue integration based on a particle filter technique, together with a manipulation system for such objects. The search areas for the joints are generated automatically from the manipulation knowledge. We present three key techniques for recognizing and manipulating objects with rotational and linear joints: 1) knowledge description for the manipulation and recognition of these objects; 2) a motion planning method for manipulating them; and 3) a recognition method closely tied to the manipulation knowledge. Moreover, we show a method by which a person can visually teach the robot a handle, one element of manipulation knowledge. Finally, a real-world daily assistive task experiment using these elements is presented.


International Conference on Mechatronics and Automation | 2011

End point tracking for a moving object with several attention regions by composite vision system

Kotaro Nagahama; Tomohiro Nishino; Mitsuharu Kojima; Kimitoshi Yamazaki; Kei Okada; Masayuki Inaba

This paper describes an approach to multi-target tracking for gaze control, used to observe the motions of end points on a moving object. To track several moving parts from image streams, three different types of trackers, observing temporal, spatial, and appearance changes, are combined. We also developed a composite vision system on which two wide-angle cameras and two zoom-enabled cameras are mounted. We tested the gaze control system and the head system by observing a human working in a daily environment; the results showed the effectiveness of our approach.


Archive | 2011

Enhanced Mother Environment with Humanoid Specialization in IRT Robot Systems

Masayuki Inaba; Kei Okada; Tomoaki Yoshikai; Ryo Hanai; Kimitoshi Yamazaki; Yuto Nakanishi; Hiroaki Yaguchi; Naotaka Hatao; Junya Fujimoto; Mitsuharu Kojima; Satoru Tokutsu; Kunihiko Yamamoto; Yohei Kakiuchi; Toshiaki Maki; Ryohei Ueda; Ikuo Mizuuchi

In research aimed at realizing high-standard, task-oriented assistant robots, a general and strategic approach to development is essential; otherwise, high functionality and the potential for evolution of those robots cannot be achieved. Robotic systems are socially expected to assist our daily lives in many situations. As a result, projects related to such robots are becoming large, involving many researchers and engineers from universities and companies. This motivated a new strategy for constructing robotic systems based on a mother environment and humanoid specialization, which keeps developing and refining the functional elements of robots in an evolutionary way. The mother environment is an entity that creates the brains of humanoid robots, in which various robotics functional elements, libraries, middleware, and other research tools are integrated. The brain of each robot is then developed using the functional elements in the mother; we call this process the specialization of a humanoid. To enhance this specialization process, we introduce a generator, which converts functions in the mother environment for the real-time layer. After research on these specialized robots, the enhanced robotics functions are incorporated back into the mother; we call this process feedback. In this chapter, we present these ideas using concrete implementation examples from the IRT projects [1], in which several robots to assist our daily lives have been developed.


Advanced Robotics | 2009

Integrating Recognition and Action Through Task-Relevant Knowledge for Daily Assistive Humanoids

Kei Okada; Mitsuharu Kojima; Satoru Tokutsu; Yuto Mori; Toshiaki Maki; Masayuki Inaba


ROBOMECH Journal | 2014

Development of 3D viewer based teleoperation interface for Human Support Robot HSR

Hiroaki Yaguchi; Kenji Sato; Mitsuharu Kojima; Kiyohiro Sogen; Yutaka Takaoka; Masayoshi Tsuchinaga; Takashi Yamamoto; Masayuki Inaba
