Network


Latest external collaboration at the country level.

Hotspot


Dive into the research topics where Yasushi Iwatani is active.

Publication


Featured research published by Yasushi Iwatani.


Automatica | 2006

Stability tests and stabilization for piecewise linear systems based on poles and zeros of subsystems

Yasushi Iwatani; Shinji Hara

This paper provides several stability tests for piecewise linear systems and proposes a stabilization method for bimodal systems. In particular, we derive an explicit and exact stability test for planar systems, given in terms of the coefficients of the transfer functions of the subsystems. Restricting attention to the bimodal and planar case, we present simple stability tests. In addition, we derive a necessary stability condition and a sufficient stability condition for higher-order bimodal systems, given in terms of the eigenvalue loci and the observability of the subsystems. All the stability tests provided in this paper are computationally tractable, and our results are applied to the stabilizability problem. We confirm the exactness and effectiveness of our approach with illustrative examples.
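
The tests are stated in terms of subsystem pole and zero data. The sketch below only computes those quantities for a hypothetical bimodal planar system; it does not reproduce the paper's decision logic.

```python
# A minimal sketch (hypothetical matrices): pole/zero data of the two subsystems
# of a bimodal planar piecewise linear system, the quantities in which the
# stability tests are stated.
import numpy as np
from scipy import signal

subsystems = {
    "mode 1": np.array([[0.0, 1.0], [-2.0, -1.0]]),
    "mode 2": np.array([[0.0, 1.0], [-3.0, -0.5]]),
}
b = np.array([[0.0], [1.0]])      # hypothetical input vector
c = np.array([[1.0, 0.0]])        # hypothetical switching/output vector
d = np.array([[0.0]])

for name, A in subsystems.items():
    poles = np.linalg.eigvals(A)                   # eigenvalue loci of the subsystem
    num, den = signal.ss2tf(A, b, c, d)            # transfer function c (sI - A)^{-1} b
    zeros = np.roots(np.trim_zeros(num[0], "f"))   # transmission zeros
    print(name, "poles:", poles, "zeros:", zeros)
```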


Advanced Robotics | 2008

Image-Based Visual PID Control of a Micro Helicopter Using a Stationary Camera

Kei Watanabe; Yuta Yoshihata; Yasushi Iwatani; Koichi Hashimoto

This paper proposes an image-based visual servo control method for a micro helicopter. The helicopter does not have any sensors to measure its position or posture on the body. A stationary camera is placed on the ground, and it obtains image features of the helicopter. The differences between current features and given reference features are computed. PID controllers then make control input voltages for helicopter control, and they drive the helicopter. The proposed controller can avoid some major difficulties in computer vision such as numerical instability due to image noises or model uncertainties, since the reference is defined in the image frames. An experimental result demonstrates that the proposed controller can keep the helicopter in a stable hover.
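
As a rough illustration of the control structure described above (image-feature error driving PID loops whose outputs are input voltages), here is a minimal sketch; the gains, feature values, and feature-to-voltage mapping are hypothetical, not the paper's tuned controller.

```python
# Minimal image-based PID sketch: the error between current and reference image
# features feeds independent PID loops whose outputs are control input voltages.
import numpy as np

class ImagePID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = np.zeros_like(kp)
        self.prev_error = np.zeros_like(kp)

    def update(self, features, reference):
        error = reference - features                 # error defined in the image frame
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        # One voltage channel per image-feature coordinate (hypothetical mapping).
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Hypothetical usage: 4 image-feature coordinates -> 4 input voltages.
pid = ImagePID(kp=np.full(4, 0.8), ki=np.full(4, 0.05), kd=np.full(4, 0.3), dt=1 / 30)
voltages = pid.update(features=np.array([120.0, 80.0, 200.0, 160.0]),
                      reference=np.array([128.0, 96.0, 192.0, 144.0]))
print(voltages)
```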


American Control Conference | 2009

Fast sensor scheduling for spatially distributed heterogeneous sensors

Shogo Arai; Yasushi Iwatani; Koichi Hashimoto

This paper addresses a sensor scheduling problem for a class of networked sensor systems whose sensors are spatially distributed and whose measurements are influenced by state-dependent noise. Sensor scheduling is required to save power, since each sensor operates on a battery. A networked sensor system usually consists of a large number of sensors, but the sensors can be classified into a few types. We therefore introduce the concept of sensor types into the sensor model and provide a fast, optimal sensor scheduling algorithm for a class of networked sensor systems, formulating the scheduling problem as a model predictive control problem. The computation time of the proposed algorithm grows exponentially with the number of sensor types, while that of standard algorithms is exponential in the number of sensors. In addition, we propose a fast sensor scheduling algorithm for a general class of networked sensor systems by using a linear approximation of the sensor model.
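
The combinatorial saving from grouping sensors by type can be sketched as follows: candidate schedules are enumerated over type sequences rather than sensor sequences over the prediction horizon. The type set and cost function below are hypothetical placeholders, not the paper's estimation-error criterion.

```python
# Enumerate |types|**N schedules instead of |sensors|**N over horizon N.
from itertools import product

sensor_types = ["camera", "range", "bearing"]        # hypothetical type set
horizon = 4

def predicted_cost(type_sequence):
    # Placeholder for the predicted estimation error under state-dependent noise.
    weight = {"camera": 1.0, "range": 1.3, "bearing": 1.6}
    return sum(weight[t] * (k + 1) for k, t in enumerate(type_sequence))

best = min(product(sensor_types, repeat=horizon), key=predicted_cost)
print("best type schedule over the horizon:", best)
print("candidates searched:", len(sensor_types) ** horizon,
      "instead of (#sensors) ** horizon")
```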


Conference on Decision and Control | 2008

Fast and optimal sensor scheduling for networked sensor systems

Shogo Arai; Yasushi Iwatani; Koichi Hashimoto

This paper addresses a sensor scheduling problem for a class of networked sensor systems whose sensors are spatially distributed and whose measurements are influenced by state-dependent noise. Sensor scheduling is required to save power, since each sensor operates on a battery. The scheduling problem is formulated as a model predictive control problem with a single sensor measurement per time step. It is assumed that all sensors have state-dependent noise and the same characteristics, which follows from the properties of networked sensor systems. We propose a fast and optimal sensor scheduling algorithm for a class of networked sensor systems. The computation time of the proposed algorithm is proportional to the number of sensors and does not depend on the prediction horizon. In addition, we provide a fast sensor scheduling algorithm for a general class of systems by using a linear approximation of the sensor model.
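
A minimal sketch of a selection step that is linear in the number of sensors and independent of the horizon is given below; the distance-based noise model and sensor positions are hypothetical, not the paper's model.

```python
# With identical sensors whose noise depends on the predicted state, the next
# sensor can be chosen with one linear scan over the sensors.
import numpy as np

sensor_positions = np.array([[0.0, 0.0], [5.0, 1.0], [2.0, 4.0], [8.0, 3.0]])

def noise_variance(sensor_pos, predicted_state):
    # Hypothetical model: variance grows with squared distance to the target.
    return 0.1 + 0.05 * np.sum((sensor_pos - predicted_state) ** 2)

def schedule_next_sensor(predicted_state):
    variances = [noise_variance(p, predicted_state) for p in sensor_positions]
    return int(np.argmin(variances))                 # O(#sensors) per time step

print("sensor selected:", schedule_next_sensor(np.array([3.0, 2.0])))
```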


International Conference on Control Applications | 1999

Analysis and design of running robots in touchdown phase

Takayuki Ikeda; Yasushi Iwatani; K. Suse; T. Mita

In order to create a running and jumping quadruped robot composed entirely of articular joints, we have developed a mono-leg robot that mimics the landing and lift-off motions of a kangaroo. From photographic data of a kangaroo's gait, the time responses of four fundamental variables are approximated by solutions of first- or second-order differential equations. We then propose a control strategy for the robot that realizes these differential equations as controlled constraints. Experimental results show that the mono-leg robot runs with a smooth jumping gait.
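
As an illustration of approximating a measured gait variable by a low-order differential-equation response, here is a sketch of a least-squares fit; the synthetic data stand in for the kangaroo photograph data, and the parameter values are hypothetical.

```python
# Fit a damped second-order response to a sampled gait variable.
import numpy as np
from scipy.optimize import curve_fit

def second_order_response(t, amplitude, damping, frequency, phase, offset):
    return amplitude * np.exp(-damping * t) * np.cos(frequency * t + phase) + offset

t = np.linspace(0.0, 1.0, 50)
measured = second_order_response(t, 0.12, 3.0, 20.0, 0.4, 0.05) \
           + 0.005 * np.random.default_rng(0).standard_normal(t.size)

params, _ = curve_fit(second_order_response, t, measured,
                      p0=[0.1, 2.0, 18.0, 0.0, 0.0])
print("fitted parameters:", params)
```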


Intelligent Robots and Systems | 2007

Multi-camera visual servoing of a micro helicopter under occlusions

Yuta Yoshihata; Kei Watanabe; Yasushi Iwatani; Koichi Hashimoto

This paper proposes a switched visual feedback control method for a micro helicopter under occlusions. Two stationary cameras are placed on the ground. They track four black balls attached to rods connected to the bottom of the helicopter. The control input is computed from the errors between the positions of the tracked objects and pre-specified reference values. The multi-camera configuration is redundant for helicopter control, but it enables the design of a switched controller that is robust against occlusions. An occlusion occurs when an object moves in front of a camera or when the background color happens to be similar to the color of a tracked object. Multi-camera systems are suitable for designing a controller that is robust under occlusions, since even when a tracked object is not visible in one camera view, other cameras may track it. The authors have previously proposed a camera selection approach: if an occlusion is detected in one camera view, then the other camera is used to control the helicopter. This paper presents another switched visual feedback control method for a micro helicopter under occlusions, called the image feature selection approach. To simplify notation, it is assumed that at most one tracked object is occluded at each time instant, although the conclusions are general. When all the tracked objects are visible, the errors between the positions of the tracked objects and the pre-specified references are used to compute the control input. If one of the tracked objects is invisible, the controller uses the errors given by the other three tracked objects, and the position of the occluded object is estimated from them.
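
A minimal sketch of the image feature selection idea follows: with one ball occluded, the error is built from the three visible ones and the occluded position is reconstructed. The centroid-plus-offset estimator and the numbers below are simplified, hypothetical stand-ins for the paper's estimation scheme.

```python
# Build the control error from visible features and estimate the occluded one
# from stored offsets relative to the centroid of the visible features.
import numpy as np

reference = np.array([[100.0, 100.0], [140.0, 100.0], [140.0, 140.0], [100.0, 140.0]])
offsets = reference - reference.mean(axis=0)     # nominal geometry of the four balls

def tracking_error(measured, visible):
    measured = measured.astype(float)
    if visible.sum() == 3:                       # at most one occlusion assumed
        centroid = measured[visible].mean(axis=0) - offsets[visible].mean(axis=0)
        occluded = np.flatnonzero(~visible)[0]
        measured[occluded] = centroid + offsets[occluded]   # estimated position
    return reference - measured                  # error used by the controller

measured = np.array([[102.0, 98.0], [143.0, 101.0], [139.0, 144.0], [0.0, 0.0]])
visible = np.array([True, True, True, False])
print(tracking_error(measured, visible))
```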


Intelligent Robots and Systems | 2008

A visual-servo-based assistant system for unmanned helicopter control

Kei Watanabe; Yasushi Iwatani; Kenichiro Nonaka; Koichi Hashimoto

This paper proposes an assistant and training system for controlling an unmanned helicopter. The unmanned helicopter has no sensors that measure its position or attitude. Stationary cameras are placed on the flight field, and the helicopter is controlled by a visual servo technique as follows. An operator steers the helicopter using the sticks of a hand-held input device. The stick inputs generate reference signals, and the assistant system computes control signals so that the helicopter tracks them. The proposed system has four functions: automatic takeoff and landing, control channel selection, flight within a desired area, and automatic motion generation. These functions enable beginners to control an unmanned helicopter. The system involves the real motion of an unmanned helicopter, which is the main difference from a flight simulator.
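
A minimal sketch of that signal flow is given below; the channel layout, scaling, and proportional tracking law are hypothetical, intended only to show how stick inputs become references that the assistant tracks.

```python
# Operator stick inputs generate reference signals; a simple tracking law then
# computes the control signal from the reference and the measured position.
import numpy as np

def stick_to_reference(stick, current_reference, rate=0.02):
    # Each stick deflection in [-1, 1] nudges the corresponding reference channel.
    return current_reference + rate * np.asarray(stick)

def assist_control(reference, measured_position, gain=0.6):
    return gain * (reference - measured_position)     # tracking control signal

reference = np.array([0.0, 0.0, 1.0, 0.0])            # x, y, z, yaw (hypothetical)
stick = [0.1, 0.0, 0.3, -0.05]
reference = stick_to_reference(stick, reference)
control = assist_control(reference, measured_position=np.array([0.02, -0.01, 0.95, 0.0]))
print(reference, control)
```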


Robotics and Biomimetics | 2011

Dependable takeoff and landing control of a small-scale helicopter with a wireless camera

Yuki Kubota; Yasushi Iwatani

This paper presents a takeoff and landing controller for a small-scale helicopter with a wireless camera. The camera, a wireless device, is the only sensor used to control the helicopter. It looks downward and captures an image of a specified mark on the ground every frame. Feedback controllers do not work well at low altitude, since the camera cannot capture a correct image of the mark there. To avoid this, the presented takeoff and landing controller combines feedback and feedforward controllers. Feedforward controllers are used when the helicopter is near the ground. When the helicopter is at a high altitude, it is controlled by a dependable feedback controller previously proposed by the authors. Since the wireless camera sometimes captures noisy images, the dependable feedback controller decides in real time whether each captured image is noisy or noise-free. If the current image is classified as noise-free, the helicopter is controlled by a PD controller; if it is classified as noisy, a feedforward controller is used, and the noisy image is not used to generate the control signal.
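
The switching logic described above can be sketched as follows; the altitude threshold, gains, and feedforward value are hypothetical placeholders, not the paper's tuned values.

```python
# Feedforward near the ground, PD-like feedback at altitude, and a feedforward
# fallback whenever the current image is classified as noisy.
def control_command(altitude, image_is_noisy, error, error_rate,
                    low_altitude=0.3, kp=0.8, kd=0.2, feedforward=0.55):
    if altitude < low_altitude:
        return feedforward                 # near the ground: feedforward only
    if image_is_noisy:
        return feedforward                 # noisy image: do not close the loop on it
    return feedforward + kp * error + kd * error_rate   # noise-free image: PD feedback

print(control_command(altitude=0.1, image_is_noisy=False, error=0.0, error_rate=0.0))
print(control_command(altitude=1.2, image_is_noisy=True, error=0.3, error_rate=-0.1))
print(control_command(altitude=1.2, image_is_noisy=False, error=0.3, error_rate=-0.1))
```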


International Conference on Advanced Robotics | 2011

Dependable visual servo control of a small-scale helicopter with a wireless camera

Yuki Kubota; Yasushi Iwatani

This paper presents a dependable visual controller for a small-scale helicopter with a wireless camera. The wireless camera sometimes captures noisy images caused by communication errors or the swinging motion of the helicopter. In our experience, if all captured images are used to generate feedback control signals, the helicopter goes out of control. Noisy images are classified into three types, and three image features are introduced to detect them. Noisy images are not used to generate control signals. The proposed dependable controller enables the helicopter to achieve long-duration hovering flight.
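
A minimal sketch of gating feedback on image quality is shown below; the three quality measures and thresholds are hypothetical stand-ins, since the paper's specific image features are not reproduced here.

```python
# Treat an image as noisy when simple quality measures cross thresholds, and
# skip feedback updates for such frames.
def image_is_noisy(mean_brightness_jump, detected_marker_count, reprojection_error,
                   max_jump=40.0, expected_markers=4, max_reproj=5.0):
    if mean_brightness_jump > max_jump:          # e.g. frames corrupted in transmission
        return True
    if detected_marker_count != expected_markers:
        return True
    return reprojection_error > max_reproj

frames = [(12.0, 4, 1.3), (85.0, 4, 1.1), (10.0, 2, 0.9)]
for frame in frames:
    print("use for feedback:", not image_is_noisy(*frame))
```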


International Conference on Robotics and Automation | 2008

Visual tracking with occlusion handling for visual servo control

Yasushi Iwatani; Kei Watanabe; Koichi Hashimoto

This paper proposes a visual tracking method which is robust to occlusion. This paper also integrates the visual tracking method and visual servo control into a vision-based control method with occlusion handling. The proposed method chooses a set of correctly extracted image features, and it then obtains an estimate of all the image features from the correctly extracted image features. The estimation procedure makes it possible to track image features even when occlusion occurs. The method has low computational complexity, since the image Jacobian is used for the image feature selection and estimation. In addition, even when the controller fails to track a moving image feature, it can find the failed image feature without a global search over the entire image plane. As a result, it can track the failed image feature again quickly.
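
The image-Jacobian idea can be sketched as follows: the motion common to the correctly extracted features is recovered by least squares on their Jacobian rows and then used to predict the occluded features. The Jacobian values below are hypothetical, and this is a simplified illustration rather than the paper's full tracker.

```python
# Recover camera motion from visible feature rows of the image Jacobian and
# predict the displacement of the occluded feature from that motion.
import numpy as np

rng = np.random.default_rng(1)
L = rng.standard_normal((8, 6))          # image Jacobian: 4 point features x (u, v) rows
velocity = np.array([0.01, -0.02, 0.0, 0.0, 0.005, 0.0])   # true camera motion (unknown)
feature_motion = L @ velocity

visible_rows = np.arange(6)              # features 1-3 tracked correctly (rows 0..5)
occluded_rows = np.arange(6, 8)          # feature 4 occluded (rows 6..7)

v_hat, *_ = np.linalg.lstsq(L[visible_rows], feature_motion[visible_rows], rcond=None)
estimated = L[occluded_rows] @ v_hat     # predicted motion of the occluded feature
print("estimated vs true:", estimated, feature_motion[occluded_rows])
```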

Collaboration


Dive into Yasushi Iwatani's collaboration.

Top Co-Authors

Masato Ishikawa

Tokyo Institute of Technology


Kaori Tsurui

University of the Ryukyus
