Network


Latest external collaboration at the country level.

Hotspot


Dive into the research topics where Yue Bao is active.

Publication


Featured research published by Yue Bao.


IEEE Intelligent Transportation Systems | 1997

Motion estimation from sequential image using correlation

In Soo Kweon; Yue Bao; Naofumi Fujiwara

In this paper, we aim at a flexible and active vehicle-following system for autonomous vehicles. To realize the system, we use a license plate instead of devices that receive and send signals. In order to gather information from the license plate, we propose three methods: one that continuously tracks the license plate, one that extracts its characters under changeable and uneven illumination, and one that calibrates the difference between the reference template image and the image selected by correlation. From the characters on the license plate, we determine the relative distance and angle. The effectiveness of the system is shown by experimental results.
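As a rough sketch of the correlation-based tracking described above (not the authors' implementation), the following Python/OpenCV snippet locates a license-plate template in each frame by normalized cross-correlation and re-crops the template around the match; the function names, the TM_CCOEFF_NORMED choice, and the simple re-cropping heuristic are assumptions for illustration.

```python
import cv2
import numpy as np

def track_plate(frame_gray, template):
    """Locate the license-plate template in the current frame by
    normalized cross-correlation; return (top-left corner, score)."""
    # Correlation map: high values where the frame resembles the template.
    res = cv2.matchTemplate(frame_gray, template, cv2.TM_CCOEFF_NORMED)
    _, score, _, top_left = cv2.minMaxLoc(res)
    return top_left, score

def update_template(frame_gray, top_left, shape):
    """Re-crop the template around the latest match so gradual changes in
    scale and illumination do not accumulate (a simple stand-in for the
    paper's template-calibration step)."""
    x, y = top_left
    h, w = shape
    return frame_gray[y:y + h, x:x + w].copy()
```

In practice the re-cropped template would only be accepted when the correlation score stays above a threshold.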


Advanced Robotics | 1997

Feed-forward multilayer neural network model for vehicle lateral guidance control

Guowen Wang; Naofumi Fujiwara; Yue Bao

An advanced vehicle lateral guidance control technology is necessary in order to develop intelligent transportation and manufacturing systems with flexibility and immediate adaptability. PID control, optimal control, and fuzzy control have often been used for designing vehicle lateral guidance controllers; in addition, automatic guidance methods based on spline curves and inverse dynamics are used in mobile robots (e.g. differential-drive robots). However, these approaches are not sufficient for developing a highly intelligent vehicle lateral guidance controller that can adapt to varying environments, because they lack learning ability and adaptability. In this paper, the possibility of applying neural networks to vehicle lateral guidance control is explored. A new neuron activation function suitable for vehicle lateral guidance control is suggested, a feed-forward multilayer neural network (FMNN) with the suggested activation function is proposed, and a vehicle lateral guidance controller (VLGC)...
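The abstract is truncated and the proposed activation function is not reproduced here. The sketch below shows only the general shape of a feed-forward multilayer network mapping lateral offset and heading error to a steering command, with tanh standing in for the paper's activation function; the class name, layer sizes, and input/output choices are illustrative assumptions.

```python
import numpy as np

def act(x):
    # Placeholder activation; the paper proposes its own function tailored
    # to lateral guidance, which is not reproduced here.
    return np.tanh(x)

class FMNN:
    """Minimal feed-forward multilayer network: lateral offset and heading
    error in, steering command out."""
    def __init__(self, sizes, rng=np.random.default_rng(0)):
        self.W = [rng.normal(0, 0.1, (m, n)) for m, n in zip(sizes[1:], sizes[:-1])]
        self.b = [np.zeros(m) for m in sizes[1:]]

    def forward(self, x):
        for W, b in zip(self.W, self.b):
            x = act(W @ x + b)
        return x

# e.g. controller = FMNN([2, 8, 1])
#      steer = controller.forward(np.array([lateral_offset, heading_error]))
```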


Archive | 2011

Autonomous Flight Control for RC Helicopter Using a Wireless Camera

Yue Bao; Syuhei Saito; Yutaro Koya

In recent years, there has been much research on autonomous flight control of micro radio-controlled (RC) helicopters, some of it concerning flight control of unmanned helicopters (Sugeno et al., 1996; Nakamura et al., 2001). The approach using a fuzzy control system consisting of IF-THEN control rules satisfies the flight-control performance requirements of an unmanned helicopter, such as hovering, takeoff, rotating, and landing. Autonomous flight control requires estimating the three-dimensional position and posture of the micro RC helicopter.

Position and posture estimation methods for autonomous flight control of RC helicopters using GPS (Global Positioning System), an IMU (Inertial Measurement Unit), a laser range finder (Amida et al., 1998), image processing, and so on have been proposed. However, GPS cannot be used where radio waves from the satellites cannot be received, so it cannot be used for flight control indoors. Methods using sensors such as an IMU and a laser range finder can be used indoors, but many expensive sensors or receivers must be arranged in the room beforehand, so these methods are not efficient. On the other hand, methods using images from a camera can be used both outdoors and indoors and are inexpensive; however, they require many artificial markers in the surroundings, and the speed of image acquisition and processing cannot keep up with the movement or vibration of an RC helicopter. A method of estimating the three-dimensional position and posture of an RC helicopter by stereo measurement with two or more cameras installed on the ground has also been proposed; in this case, the moving range of the helicopter is limited to the area covered by the cameras, and a high-resolution camera must be used to cover the whole moving range (Ohtake et al., 2009).

The authors are studying autonomous flight of an RC helicopter with a small wireless camera and a simple artificial marker set on the ground. This method does not require expensive sensors, receivers, or cameras to be placed in the flight environment, and a wider flight range becomes possible if natural feature points are detected in the image obtained by the camera on the RC helicopter. This chapter contains the following contents.
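The chapter's own estimation algorithm is not reproduced here; as a minimal sketch of the standard marker-based pose step such a system relies on, the snippet below recovers camera position and orientation from the four image corners of a known square ground marker with OpenCV's solvePnP. The marker size, variable names, and calibration inputs are assumptions.

```python
import cv2
import numpy as np

# Known 3D corner coordinates of a square ground marker (metres),
# hypothetical 20 cm side, lying in the z = 0 plane.
MARKER = np.array([[-0.1, -0.1, 0.0],
                   [ 0.1, -0.1, 0.0],
                   [ 0.1,  0.1, 0.0],
                   [-0.1,  0.1, 0.0]], dtype=np.float32)

def pose_from_marker(corners_px, K, dist):
    """Estimate camera position/posture from the marker's image corners.
    corners_px: 4x2 pixel coordinates, K: camera matrix, dist: distortion."""
    ok, rvec, tvec = cv2.solvePnP(MARKER, corners_px.astype(np.float32), K, dist)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)
    # Camera (helicopter) position expressed in marker/world coordinates.
    position = (-R.T @ tvec).ravel()
    return position, R
```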


International Conference on Control, Automation, Robotics and Vision | 2008

Log-Polar transform in 3D environment

Takayuki Nakata; Yue Bao

The log-polar transform is mentioned as one of the models of the central-fovea visual sensor in biological vision. In the field of image recognition, the log-polar transform is widely known and is used especially in face image recognition. However, since it is invariant only to scaling and rotation, its applications have mainly remained in 2D space. In this paper, a new log-polar transform algorithm that can also detect affine transform parameters is proposed, and the algorithm is extended so that perspective-transform invariants can be detected. Using this transform, pattern recognition in 3D space becomes possible.
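As a minimal sketch of the classical 2D log-polar transform the paper builds on, the following Python function resamples a grayscale image onto a log-polar grid, so that scaling and rotation of the input become shifts along the output axes. The grid resolution and nearest-neighbour sampling are illustrative choices; the paper's 3D extension is not shown.

```python
import numpy as np

def log_polar(img, n_rho=64, n_theta=128):
    """Resample a grayscale image onto a log-polar grid centred on the image
    centre; scaling and rotation in the input become shifts along the rho
    and theta axes of the output."""
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    r_max = np.hypot(cy, cx)
    rho = np.exp(np.linspace(0.0, np.log(r_max), n_rho))        # log-spaced radii
    theta = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)  # sample angles
    R, T = np.meshgrid(rho, theta, indexing='ij')
    ys = np.clip(np.round(cy + R * np.sin(T)).astype(int), 0, h - 1)
    xs = np.clip(np.round(cx + R * np.cos(T)).astype(int), 0, w - 1)
    return img[ys, xs]                                          # nearest-neighbour lookup
```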


Transactions of the Japan Society of Mechanical Engineers. C | 2000

A Six Degree of Freedom Vehicle Measurement Scanner With Fan-Shaped Laser Beam.

Yue Bao; Kenta Kobayashi; Naofumi Fujiwara

This paper proposes a 3D laser motion measurement system. The system consists of a laser scanner mounted on a vehicle and retro-reflectors (corner cubes) placed in the environment as landmarks. The laser scanner rotates a fan-shaped laser beam to detect the retro-reflectors and measures their relative directions. Using these directions, the three-dimensional position and posture of the vehicle are obtained in real time.
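As a simplified analogue of recovering pose from measured directions to known retro-reflectors, the sketch below solves a planar (2D) bearing-only resection by nonlinear least squares; the landmark coordinates, initial guess, and reduction to three degrees of freedom are assumptions for illustration, whereas the paper's scanner recovers all six degrees of freedom.

```python
import numpy as np
from scipy.optimize import least_squares

# Known 2D landmark (retro-reflector) positions in the world frame (hypothetical).
LANDMARKS = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, 8.0]])

def residuals(pose, bearings):
    """pose = (x, y, heading); bearings are the measured angles to each
    landmark relative to the vehicle heading."""
    x, y, th = pose
    pred = np.arctan2(LANDMARKS[:, 1] - y, LANDMARKS[:, 0] - x) - th
    d = pred - bearings
    return np.arctan2(np.sin(d), np.cos(d))   # wrap angle errors to [-pi, pi]

def estimate_pose(bearings, guess=(1.0, 1.0, 0.0)):
    """Least-squares fit of the vehicle pose to the measured bearings."""
    return least_squares(residuals, guess, args=(bearings,)).x
```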


Transactions of the Japan Society of Mechanical Engineers. C | 1999

Automatic guidance for a Vehicle using Spline Curve.

Akira Suzuki; Yue Bao; Naofumi Fujiwara; Masumi Kano; Tetsuya Hirano

In this paper, a new guidance control system for a mobile robot is described. As a method of path planning for navigating the robot, trajectory design using a B-spline curve is proposed. The trajectory has continuous curvature and passes near the given points on a 2D plane when appropriate new points are added. Velocity data are calculated from the B-spline curve in order to navigate a PWS robot, and the authors propose a servo mechanism that makes the robot track the trajectory accurately. The results of simulation and experiment show the effectiveness of this method.
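As a minimal sketch of trajectory design with a B-spline (using SciPy rather than the authors' formulation), the snippet below fits a cubic B-spline through a set of waypoints and evaluates positions and headings along it; the waypoint values and smoothing parameter are illustrative assumptions.

```python
import numpy as np
from scipy.interpolate import splprep, splev

# Waypoints the robot should pass near (hypothetical, metres).
waypoints = np.array([[0.0, 0.0], [1.0, 0.5], [2.0, 0.4], [3.0, 1.5], [4.0, 1.2]])

# Fit a cubic B-spline (k=3 gives continuous curvature) through the waypoints.
tck, _ = splprep([waypoints[:, 0], waypoints[:, 1]], k=3, s=0.0)

u = np.linspace(0.0, 1.0, 200)
x, y = splev(u, tck)              # positions along the trajectory
dx, dy = splev(u, tck, der=1)     # first derivatives give the travel direction
heading = np.arctan2(dy, dx)      # heading reference for the tracking servo
```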


IEEE Intelligent Transportation Systems | 1997

A method of location estimating for a vehicle by using image processing

Yue Bao; Naofumi Fujiwara; Zhiheng Jiang

A method is proposed for finding the position and direction of a vehicle by processing images of flat landmarks. With multiple landmarks set over a wide area, the vehicle can run in an arbitrary space. Since low locating accuracy was found in the area directly in front of a flat landmark, a cubic landmark is proposed here to improve the locating accuracy, and experimental results are reported.


Transactions of the Japan Society of Mechanical Engineers. C | 1999

Auto-Tracking of Vehicle by Flexible Target Point Following Algorithm.

Naoki Suganuma; Kazufumi Matukawa; Hajime Togiya; Yue Bao; Naofumi Fujiwara


Systems and Computers in Japan | 2007

A 3D log-polar transform for pattern recognition

Takayuki Nakata; Yue Bao; Naofumi Fujiwara


Transactions of the Japan Society of Mechanical Engineers. C | 1998

Dynamic Detection of a License Plate Using Neural Network.

Naoki Suganuma; Soo In Kweon; Yue Bao; Naofumi Fujiwara

Collaboration


Dive into Yue Bao's collaborations.

Top Co-Authors

Takayuki Nakata

Toyama Prefectural University

Hironari Matsuda

Toyama Prefectural University
