Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Rui Fukui is active.

Publication


Featured research published by Rui Fukui.


Intelligent Robots and Systems | 2002

High resolution pressure sensor distributed floor for future human-robot symbiosis environments

Hiroshi Morishita; Rui Fukui; Tomomasa Sato

This paper proposes a high-resolution sensor floor which can detect both humans and robots simultaneously. Each sensor floor unit is 500 mm square and is equipped with 4,096 pressure switches distributed in a 64 × 64 array. A 2 m by 2 m sensor floor composed of 16 of these units has been realized. Experiments with this sensor floor have successfully determined the positions of a human and a 4-wheeled cart and distinguished between them; the distinction is easily achieved because of the high resolution of the floor. The modular structure of the sensor floor enables easy application to a real room of irregular shape, allowing unconstrained measurement of the locations of humans, robots, and objects in the room. Consequently, sensor floor systems will be essential components of future human-robot symbiosis systems that assist humans in daily life, and will also play important roles in medical and welfare robot systems.
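The abstract gives no implementation details, but the detection step it describes can be illustrated with a short sketch: threshold each unit's 64 × 64 switch array, stitch the units into one grid, and label connected pressed regions, whose footprint area then separates, say, a cart wheel from a foot. The stitching layout, noise threshold, and classification rule below are illustrative assumptions, not the authors' code.

```python
# Illustrative sketch only: locating pressure blobs on a stitched sensor floor.
# The unit size (64x64 switches per 500 mm tile) comes from the abstract; the
# stitching layout, thresholds, and the toy classification rule are assumptions.
import numpy as np
from scipy import ndimage

CELL_MM = 500.0 / 64.0          # pitch of one pressure switch, roughly 7.8 mm

def stitch_units(units):
    """units: dict mapping (row, col) tile index -> 64x64 boolean switch array."""
    rows = 1 + max(r for r, _ in units)
    cols = 1 + max(c for _, c in units)
    floor = np.zeros((rows * 64, cols * 64), dtype=bool)
    for (r, c), tile in units.items():
        floor[r * 64:(r + 1) * 64, c * 64:(c + 1) * 64] = tile
    return floor

def detect_objects(floor, min_cells=4):
    """Label connected ON-switch regions and report centroid and footprint area."""
    labels, n = ndimage.label(floor)
    objects = []
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labels == i)
        if ys.size < min_cells:
            continue                     # ignore isolated noisy switches
        area_mm2 = ys.size * CELL_MM ** 2
        centroid_mm = (ys.mean() * CELL_MM, xs.mean() * CELL_MM)
        # toy rule: wheels give small separated contacts, feet give larger ones
        kind = "cart wheel" if area_mm2 < 2000 else "foot / large contact"
        objects.append({"centroid_mm": centroid_mm, "area_mm2": area_mm2, "kind": kind})
    return objects
```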


Ubiquitous Computing | 2011

Hand shape classification with a wrist contour sensor: development of a prototype device

Rui Fukui; Masahiko Watanabe; Tomoaki Gyota; Masamichi Shimosaka; Tomomasa Sato

In this paper, we describe a novel sensor device which recognizes hand shapes from wrist contours. Although hand shapes can express various meanings with small gestures, they are rarely used as an interface in domestic settings because a concise recognition method has not been established. To recognize hand shapes anywhere without stressing the user, we developed a wearable wrist contour sensor device and a recognition system. In the system, features such as the sum of gaps are extracted from wrist contours. We conducted a classification test with eight hand shapes and achieved a classification rate of approximately 70%.
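Only the "sum of gaps" feature is named in the abstract; the sketch below shows how such contour features could feed a simple classifier. The sensor model (a ring of gap readings around the wrist), the additional features, and the use of a k-nearest-neighbour classifier are assumptions made for illustration, not the paper's method.

```python
# Hypothetical sketch: classifying hand shapes from a ring of wrist-contour
# gap readings. Only the "sum of gaps" feature is taken from the abstract;
# everything else (sensor model, extra features, k-NN classifier) is assumed.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def contour_features(gaps_mm):
    """gaps_mm: 1-D array of sensor-to-skin gap readings sampled around the wrist."""
    d = np.asarray(gaps_mm, dtype=float)
    return np.array([
        d.sum(),            # "sum of gaps", the feature mentioned in the abstract
        d.max() - d.min(),  # assumed: depth range of the contour
        float(np.argmax(d)) # assumed: angular index of the largest gap
    ])

def train_classifier(samples, labels):
    """samples: list of raw contour readings; labels: hand-shape names."""
    X = np.stack([contour_features(s) for s in samples])
    return KNeighborsClassifier(n_neighbors=3).fit(X, labels)

# usage: clf = train_classifier(train_readings, train_labels)
#        shape = clf.predict([contour_features(new_reading)])[0]
```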


International Conference on Robotics and Automation | 2012

Grasping by caging: A promising tool to deal with uncertainty

Weiwei Wan; Rui Fukui; Masamichi Shimosaka; Tomomasa Sato; Yasuo Kuniyoshi

This paper presents a novel approach to dealing with uncertainty in grasping. The basic idea is to initiate a caging manipulation state and then shrink the fingers into immobilization to perform a practical grasp. Thanks to the flexibility afforded by caging, this procedure is intrinsically safe and tolerant of uncertainty. We also demonstrate that immobilization is the minimal caging, and consequently propose using three or four fingers to manipulate planar convex objects in a grasping-by-caging manner. Experimental results with physical simulation show the robustness and efficacy of our approach. We expect its main benefits to be a reduced number of fingers, handling of low-friction materials and, especially, tolerance of pose/shape uncertainty.


International Conference on Robotics and Automation | 2013

A new “grasping by caging” solution by using eigen-shapes and space mapping

Weiwei Wan; Rui Fukui; Masamichi Shimosaka; Tomomasa Sato; Yasuo Kuniyoshi

"Grasping by caging" has been considered a powerful tool for dealing with uncertainty. In this paper, we continue to explore grasping by caging and propose a new solution using eigen-shapes and space mapping. First, eigen-shapes fix dexterous hands into a series of finger formations, which reduces dimensionality and computational complexity. Second, space mapping builds a mapping between rasterized grids in the 2-D work space (W space) and rasterized voxels in the 3-D configuration space (C space), which allows the C space to be reconstructed rapidly so that we can efficiently measure the robustness of caging and find an optimal caging configuration for grasping. Our algorithm can rapidly compute squeezing cages for arbitrary 2-D shapes, including objects with convex or concave boundaries, first-order or higher-order boundaries, and even objects with inner holes. We implement the algorithm in MATLAB and carry out experiments in Webots simulation to test its robustness to uncertainty. The results show that our algorithm works well with various object shapes and is robust to noisy control and noisy perception. It is promising for power grasping tasks with dexterous hands.
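The connectivity test behind this space-mapping idea can be illustrated in a reduced setting: if the object is simplified to a disc, its configuration space collapses to the 2-D position of its centre, and the caging check becomes a flood fill over free grid cells. The sketch below uses that simplification; the paper's full method works with eigen-shape finger formations and a 3-D (x, y, θ) C space, which are omitted here, and the grid size, radii, and finger layout are assumptions.

```python
# Reduced illustration of the connectivity test behind "grasping by caging":
# the object is simplified to a disc so that its C space collapses to the 2-D
# position of its centre. Point fingers become discs of the object's radius in
# that C space; the object is caged if the flood fill from its current cell
# never reaches the grid border.
import numpy as np
from collections import deque

def free_configurations(grid_size, fingers, obj_radius):
    """Mark centre positions where a disc of obj_radius overlaps no finger point."""
    ys, xs = np.mgrid[0:grid_size, 0:grid_size]
    free = np.ones((grid_size, grid_size), dtype=bool)
    for fy, fx in fingers:
        free &= (ys - fy) ** 2 + (xs - fx) ** 2 > obj_radius ** 2
    return free

def is_caged(free, start):
    """Flood fill from the object's current cell; caged iff the border is unreachable."""
    n = free.shape[0]
    seen = np.zeros_like(free)
    queue = deque([start])
    seen[start] = True
    while queue:
        y, x = queue.popleft()
        if y in (0, n - 1) or x in (0, n - 1):
            return False                      # an escape path exists
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if free[ny, nx] and not seen[ny, nx]:
                seen[ny, nx] = True
                queue.append((ny, nx))
    return True

# four fingers around a disc of radius 6 centred at (20, 20); expected: True
fingers = [(12, 20), (28, 20), (20, 12), (20, 28)]
free = free_configurations(40, fingers, obj_radius=6)
print(is_caged(free, start=(20, 20)))
```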


International Conference on Pervasive Computing | 2012

A unified framework for modeling and predicting going-out behavior

Shoji Tominaga; Masamichi Shimosaka; Rui Fukui; Tomomasa Sato

Going out is almost indispensable for a healthy life in society, and it has received increasing attention in many fields, including pervasive computing and medical science. Various factors affect daily going-out behavior, such as the day of the week, the condition of one's health, and the weather. We assume that a person has their own rhythm, or patterns of going out, as a result of these factors. In this paper, we propose a non-parametric clustering method to extract one's rhythm of daily going-out behavior and a method for predicting one's future presence using the extracted models. We collected time histories of going out and coming home (6 subjects, 827 days in total). Experimental results show that our method copes with the complexity of the patterns and flexibly adapts to unseen observations.
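As a rough illustration of such a pipeline, the sketch below represents each day as a binary away-from-home vector over 30-minute slots, groups days without fixing the number of clusters, and predicts the rest of the current day from the best-matching cluster profile. The paper uses a non-parametric Bayesian model; the agglomerative clustering with a distance threshold used here is only a simple stand-in, and the slot size and threshold are assumptions.

```python
# Hypothetical stand-in for the going-out modeling pipeline: cluster daily
# presence patterns without fixing the number of clusters, then predict the
# remainder of the current day from the closest cluster profile.
import numpy as np
from sklearn.cluster import AgglomerativeClustering

SLOTS = 48  # 30-minute slots per day (assumed discretization)

def cluster_days(days):
    """days: (n_days, SLOTS) binary matrix, 1 = away from home."""
    model = AgglomerativeClustering(n_clusters=None, distance_threshold=6.0,
                                    linkage="average")
    labels = model.fit_predict(days)
    # per-cluster probability of being away in each slot
    return np.stack([days[labels == k].mean(axis=0) for k in np.unique(labels)])

def predict_presence(profiles, observed, now_slot):
    """Pick the profile closest to today's observed slots and return the
    predicted away-probability for the remaining slots of the day."""
    errs = np.abs(profiles[:, :now_slot] - observed[:now_slot]).mean(axis=1)
    best = profiles[int(np.argmin(errs))]
    return best[now_slot:]
```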


Intelligent Robots and Systems | 2010

Moving objects detection and classification based on trajectories of LRF scan data on a grid map

Taketoshi Mori; Takahiro Sato; Hiroshi Noguchi; Masamichi Shimosaka; Rui Fukui; Tomomasa Sato

Laser-based environment recognition technologies have developed rapidly in recent years. In particular, detection and classification of moving objects by laser scanners mounted on a moving platform is required for mobile robots and autonomous cars. In this paper, we propose a method for detecting and classifying moving objects based on grid trajectories acquired from sequential laser scan data. Grid trajectories are obtained by voting sequential laser scan points onto a grid map; these trajectories not only support correct scan segmentation but also represent the size and speed of moving objects. We classify a moving object as a person, a group of people, a bike, or a car based on its grid trajectory. In our experiments, laser scanners mounted on our platform acquired scan data on the university campus, and the results illustrate the effectiveness of the proposed method in outdoor environments.
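A toy version of the voting-and-classification idea is sketched below: scan points are quantized to grid cells, an object's footprint size and speed are estimated from its cell trajectory, and simple thresholds assign a class. The cell size, the per-object segmentation, and the thresholds are assumptions for illustration and not the values used in the paper.

```python
# Illustrative sketch only: voting laser scan points onto a grid and using the
# resulting per-object trajectory to estimate size and speed.
import numpy as np

CELL = 0.2  # assumed grid cell size in metres

def to_cells(scan_xy):
    """Quantize (x, y) scan points in metres to unique integer grid cells."""
    return np.unique(np.floor(np.asarray(scan_xy) / CELL).astype(int), axis=0)

def track_stats(frames, dt):
    """frames: list of (x, y) point arrays for one segmented object over time.
    Returns footprint size (m) and mean speed (m/s) from the grid trajectory."""
    centroids, extents = [], []
    for pts in frames:
        cells = to_cells(pts)
        centroids.append(cells.mean(axis=0) * CELL)
        extents.append((cells.max(axis=0) - cells.min(axis=0) + 1).max() * CELL)
    centroids = np.asarray(centroids)
    speed = np.linalg.norm(np.diff(centroids, axis=0), axis=1).mean() / dt
    return float(np.median(extents)), float(speed)

def classify(size_m, speed_ms):
    """Toy rule in the spirit of the paper: size and speed separate the classes."""
    if size_m > 3.0:
        return "car"
    if speed_ms > 3.0:
        return "bike" if size_m < 1.5 else "car"
    return "person" if size_m < 1.0 else "group of people"
```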


Intelligent Robots and Systems | 2008

Development of a home-use automated container storage/retrieval system

Rui Fukui; Hiroshi Morishita; Taketoshi Mori; Tomomasa Sato

This paper describes the development of a home-use automated container storage/retrieval system (ACSRS), a sub-system of a logistical support robot system for living spaces. The system has three features. (1) An elevator-type structure enables not only automatic stocking motions but also human access to containers, as if the automated rack were an ordinary shelf. (2) A separated motion layout makes it possible to use the space near the ceiling for storage and reduces the chance of accidental jamming by reducing the volume swept by the robot's motion. (3) The combination of a container guide plate and RFID sensing guides people to place a container on a shelf in the desired position and posture. Experiments confirmed that the ACSRS can transfer a container with a 5 kg load (8 kg in total). We summarize the key technical points for designing a system of this type.


International Conference on Robotics and Automation | 2015

Improving regrasp algorithms to analyze the utility of work surfaces in a workcell

Weiwei Wan; Matthew T. Mason; Rui Fukui; Yasuo Kuniyoshi

The goal of this paper is to develop a regrasp planning algorithm general enough to support statistical analysis with thousands of experiments and arbitrary mesh models. We focus on pick-and-place regrasp, which reorients an object from one placement to another by a sequence of pick-ups and place-downs. We improve the pick-and-place regrasp approach developed in the 1990s and analyze its performance in robotic assembly with different work surfaces in the workcell. Our algorithm automatically computes the stable placements of an object, finds several force-closure grasps, generates a graph of regrasp actions, and searches for regrasp sequences. We demonstrate the advantages of our algorithm with various mesh models and use it to evaluate the completeness, cost, and length of regrasp sequences for different mesh models and different assembly tasks in the presence of different work surfaces. Our results show that spare work surfaces are beneficial to assembly, while tilted work surfaces are only sometimes beneficial, depending on the objects.
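The graph-of-regrasp-actions step can be sketched compactly: nodes are (placement, grasp) pairs, edges are transfers (same grasp, different placement) or regrasps (same placement, different grasp), and a breadth-first search yields the shortest pick-and-place sequence. The identifiers and toy input below are hypothetical, and computing stable placements and force-closure grasps from mesh models is assumed to have been done beforehand.

```python
# A minimal sketch of the regrasp-graph idea, not the authors' implementation.
# Nodes are (placement, grasp) pairs; two nodes sharing a grasp are linked by a
# "transfer" (the hand moves the object between placements without releasing),
# and two nodes sharing a placement are linked by a "transit" (the object rests
# while the hand regrasps). BFS returns the shortest alternating sequence.
from collections import deque

def plan_regrasp(valid, start_placement, goal_placement):
    """valid: dict mapping each stable placement to the set of collision-free,
    force-closure grasps available there (assumed precomputed)."""
    nodes = [(p, g) for p, grasps in valid.items() for g in grasps]
    start_nodes = [n for n in nodes if n[0] == start_placement]
    frontier = deque((n, [n]) for n in start_nodes)
    seen = set(start_nodes)
    while frontier:
        (p, g), path = frontier.popleft()
        if p == goal_placement:
            return path                           # alternating transfer/transit plan
        for q, h in nodes:
            transfer = (h == g and q != p)        # carry the object elsewhere
            transit = (q == p and h != g)         # regrasp at the same placement
            if (transfer or transit) and (q, h) not in seen:
                seen.add((q, h))
                frontier.append(((q, h), path + [(q, h)]))
    return None

# toy example: the goal placement shares no grasp with the start, so the plan
# must pass through an intermediate placement
valid = {"P0": {"gA", "gB"}, "P1": {"gB", "gC"}, "P2": {"gC"}}
print(plan_regrasp(valid, "P0", "P2"))
```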


Advanced Robotics | 2015

Hand shape classification in various pronation angles using a wearable wrist contour sensor

Rui Fukui; Masahiko Watanabe; Masamichi Shimosaka; Tomomasa Sato

Hand gestures are potentially useful for communication between humans and between a human and a machine. However, existing methods entail several problems for practical use. We have proposed an approach to hand shape recognition based on wrist contour measurement. In this paper, two issues are addressed in particular. The first is the development of a new sensing device in which all elements are installed in a wrist-watch-type device. The second is the development of a new hand shape classifier that can accommodate changes in pronation angle. The developed sensing device enables wrist contour data collection under conditions in which the pronation angle varies. The classifier recognizes the hand shape based on statistics produced through data forming and statistics conversion processes. The most important result is that there is no large difference between classification rates that include and those that exclude the independent (preliminary) pronation estimation process using inertial measurement units. This result suggests two possible insights: (1) the wrist contour has some features that depend on the hand shape but not on the pronation angle, or (2) the wrist contour potentially includes information about pronation angle variation. These insights indicate that hand shape can potentially be recognized solely from the wrist contour, even while the pronation angle changes.


Advanced Robotics | 2013

Development of wrist contour measuring device for an interface using hand shape recognition

Rui Fukui; Masahiko Watanabe; Masamichi Shimosaka; Tomomasa Sato

Recently, gesture recognition has come to be widely used as an interface. Popular gestures are mainly arm motions and whole-body motions. Although hand shape is a good sign that can express rich information with small motions, few applications are in practical use because existing methods have several problems: blocking of finger sensation and interference with finger motion, restrictions on hand position and posture, and complex initial configuration. In this study, we attempt to recognize hand shapes by observing the wrist contour, which varies with finger motions. We have developed a robust wrist-watch-type device that captures the wrist contour and have collected data from a substantial number of subjects. With the collected data, we conduct hand shape recognition experiments under several conditions. To overcome positioning deviations and individual differences, two types of features are designed. Through the experiments, the potential of the features is confirmed and several effective features are identified. In addition, concerning the design of recognition targets, we examine the number and combination of target hand shapes, and several guidelines for target design are revealed.

Collaboration


Dive into Rui Fukui's collaboration.

Top Co-Authors

Taketoshi Mori

Aoyama Gakuin University