Publication


Featured research published by Susumu Tachi.


IEEE Computer Graphics and Applications | 2005

Vision-based sensor for real-time measuring of surface traction fields

Kazuto Kamiyama; Kevin Vlack; T. Mizota; Hiroyuki Kajimoto; K. Kawakami; Susumu Tachi

The desire to reproduce and expand the human senses drives innovations in sensor technology. Conversely, human-interface research aims to allow people to interact with machines as if they were natural objects in a cybernetic, human-oriented way. We wish to unite the two paradigms with a haptic sensor as versatile as the sense of touch, developed for a dual purpose: to improve the robotic capability to interact with the physical world, and to improve the human capability to interact with the virtual world for emerging applications with a heightened sense of presence. We designed a sensor, dubbed GelForce, that acts as a practical tool in both conventional and novel desktop applications using common consumer hardware. By measuring a surface traction field, the GelForce tactile sensor can represent the magnitude and direction of force applied to the skin's surface using computer vision. This article is available with a short video documentary on CD-ROM.
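
As a purely illustrative, hedged sketch of the vision-based idea (not the authors' pipeline): assuming marker displacements have already been tracked in the camera image, a pre-calibrated linear map can convert them into force vectors. The function name, calibration matrix, and numbers below are hypothetical.

import numpy as np

def traction_field_from_markers(displacements, calibration):
    """Map 2-D marker displacements to 3-D force vectors.

    displacements: (N, 2) array of marker shifts in pixels, e.g. obtained by
        tracking dots embedded in the elastic body (hypothetical preprocessing).
    calibration: (3, 2) matrix relating a marker's displacement to the local
        force vector, assumed to be found by pressing known loads beforehand.
    Returns an (N, 3) array of estimated force vectors (fx, fy, fz).
    """
    return displacements @ calibration.T

# Toy example: two markers and an invented calibration matrix.
disp = np.array([[1.5, -0.2],
                 [0.0, 0.8]])
calib = np.array([[0.10, 0.00],
                  [0.00, 0.10],
                  [0.05, 0.05]])
print(traction_field_from_markers(disp, calib))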


International Conference on Computer Graphics and Interactive Techniques | 2007

Gravity grabber: wearable haptic display to present virtual mass sensation

Kouta Minamizawa; Souichiro Fukamachi; Hiroyuki Kajimoto; Naoki Kawakami; Susumu Tachi

We propose a wearable haptic display that presents the weight sensation of a virtual object, based on our novel insight that deformation of the fingerpads produces a reliable weight sensation even when proprioceptive sensation is absent. This device will provide a new form of ubiquitous haptic interaction.
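
As a minimal sketch of how such a device might be driven (gains and limits are invented for illustration; this is not the authors' controller), the snippet below maps a virtual object's mass to a belt-motor torque that shears the fingerpad.

GRAVITY = 9.81             # m/s^2
TORQUE_PER_NEWTON = 0.02   # hypothetical motor gain: belt torque per newton of load
MAX_TORQUE = 0.15          # hypothetical actuator limit (N*m)

def belt_torque_for_mass(mass_kg, grip_scale=1.0):
    """Return the motor torque that stretches the belt across the fingerpad so
    its shear deformation mimics holding an object of the given mass."""
    load = mass_kg * GRAVITY * grip_scale        # simulated gravitational pull
    return min(load * TORQUE_PER_NEWTON, MAX_TORQUE)

for m in (0.1, 0.5, 1.0):
    print(f"{m:.1f} kg -> {belt_torque_for_mass(m):.3f} N*m")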


Robotics and Autonomous Systems | 2004

Humanoid robotics platforms developed in HRP

Hirohisa Hirukawa; Fumio Kanehiro; Kenji Kaneko; Shuuji Kajita; Kiyoshi Fujiwara; Yoshihiro Kawai; Fumiaki Tomita; Shigeoki Hirai; Kazuo Tanie; Takakatsu Isozumi; Kazuhiko Akachi; Toshikazu Kawasaki; Shigehiko Ota; Kazuhiko Yokoyama; Hiroyuki Handa; Yutaro Fukase; Junichiro Maeda; Yoshihiko Nakamura; Susumu Tachi; Hirochika Inoue

This paper presents a humanoid robotics platform that consists of a humanoid robot and an open-architecture software platform, developed in METI's Humanoid Robotics Project (HRP). The final version of the robot, called HRP-2, is 1540 mm tall, weighs 58 kg, and has 30 degrees of freedom. The software platform includes a dynamics simulator and motion controllers for biped locomotion, falling, and getting-up motions. The platform has been used to develop various applications and is expected to stimulate further humanoid robotics research.


Human Factors in Computing Systems | 2001

RobotPHONE: RUI for interpersonal communication

Dairoku Sekiguchi; Masahiko Inami; Susumu Tachi

RobotPHONE is a Robotic User Interface (RUI) that uses robots as physical avatars for interpersonal communication. Using RobotPHONE, users in remote locations can communicate shapes and motion with each other. In this paper we present the concept of RobotPHONE, and describe implementations of two prototypes.
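
One plausible, hedged reading of the shape-and-motion exchange is a symmetric bilateral servo in which each robot moves its joints toward the partner's last reported pose; the sketch below illustrates that idea with invented gains and is not the actual RobotPHONE controller.

def symmetric_bilateral_step(local_angles, remote_angles, gain=0.2):
    """One control tick of a hypothetical symmetric servo: each joint moves a
    fraction of the way toward the partner's pose, so manipulating either
    robot deforms both."""
    return [a + gain * (b - a) for a, b in zip(local_angles, remote_angles)]

# Toy example: robot A starts bent, robot B starts straight; the poses converge.
a, b = [30.0, -10.0], [0.0, 0.0]
for _ in range(5):
    a, b = symmetric_bilateral_step(a, b), symmetric_bilateral_step(b, a)
print(a, b)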


International Conference on Robotics and Automation | 2005

An Encounter-Type Multi-Fingered Master Hand Using Circuitous Joints

Shuhei Nakagawara; Hiroyuki Kajimoto; Naoki Kawakami; Susumu Tachi; Ichiro Kawabuchi

We have developed a new type of master hand to control a dexterous slave robot hand for telexistence. Our master hand has two features. One is a compact exoskeleton mechanism called a “circuitous joint,” which covers the wide workspace of an operator’s finger. The other is encounter-type force feedback, which enables unconstrained motion of the operator’s finger and a natural contact sensation. In this paper, the mechanism and control method of our master hand are introduced, and experimental master-slave finger control is conducted.
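
As a hedged, one-axis illustration of encounter-type force feedback (the standoff, stiffness, and simplifications are ours, not the paper's): the device can shadow the fingertip at a small offset in free space and only render force once the finger reaches a virtual surface.

def encounter_type_command(finger_pos, object_pos, standoff=0.005, k_contact=300.0):
    """Return (device_target_position, commanded_force) for one axis (m, N).

    While the finger is away from the virtual object, the device tracks the
    finger at a small standoff so the operator feels nothing. Once the finger
    would penetrate the object, the device stops at the surface and renders a
    spring-like contact force."""
    if finger_pos < object_pos:                 # free motion: no contact yet
        return finger_pos + standoff, 0.0
    penetration = finger_pos - object_pos       # finger is "inside" the object
    return object_pos, k_contact * penetration  # hold the surface, push back

print(encounter_type_command(0.010, 0.020))  # free space
print(encounter_type_command(0.023, 0.020))  # contact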


International Conference on Robotics and Automation | 1993

Dynamic control of a manipulator with passive joints in operational space

Hirohiko Arai; Kazuo Tanie; Susumu Tachi

A method for controlling a manipulator with passive joints, which have no actuators, in operational space is presented. The equation of motion is described in terms of operational coordinates. The coordinates are separated into active and passive components. The acceleration of the active components can be arbitrarily adjusted by using the coupling characteristics of manipulator dynamics. This method is also extended to path tracking control of a manipulator with passive joints. A desired path is geometrically specified in operational space. The position of the manipulator is controlled to follow the path. In this method, a path coordinate system based on the path is defined in operational space. The path coordinates consist of a component parallel to the path and components normal to the path. The acceleration of the components normal to the path is controlled according to feedback based on tracking error by using the dynamic coupling among the components. This in turn keeps the manipulator on the path. The effectiveness of the method is verified by experiments using a two-degree-of-freedom manipulator with a passive joint.
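
In schematic form (notation ours and simplified relative to the paper), the approach can be summarized by partitioning the operational-space dynamics into active and passive components and closing a feedback loop on the path-normal error through the dynamic coupling:

% Joint-space torques have zero entries at the passive joints:
\tau = \begin{bmatrix} \tau_a \\ 0 \end{bmatrix}
% Equation of motion in operational coordinates x, split into active and
% passive components x = (x_a, x_p):
\Lambda(x)\,\ddot{x} + \mu(x,\dot{x}) = J^{-\top}(x)\,\tau
% The rows associated with x_p receive no direct actuation, but the dynamic
% coupling lets \tau_a adjust the acceleration of the active components.
% For path tracking, with e the error normal to the desired path, a feedback law
\ddot{e}_{\mathrm{des}} = -K_d\,\dot{e} - K_p\,e
% is imposed on the normal components through this coupling, which keeps the
% manipulator on the path.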


Robot and Human Interactive Communication | 2000

A tactile display using surface acoustic wave

Masaya Takasaki; Takaaki Nara; Susumu Tachi; Toshiro Higuchi

We propose a novel method to provide human tactile sensation using surface acoustic waves (SAWs). A pulse-modulated driving voltage excites a temporal distribution of shear force, or friction shift, on the surface of a SAW substrate. This force/friction distribution can be perceived as tactile sensation by the mechanoreceptors in the finger skin. A first prototype successfully rendered roughness.
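
As a hedged illustration of the pulse-modulation idea (the carrier and burst frequencies are invented, and a real SAW device is driven by RF hardware, not Python), the snippet below gates a high-frequency carrier with a low-frequency burst envelope so that friction is switched on and off at a tactile rate.

import math

def saw_drive_sample(t, carrier_hz=9.6e6, burst_hz=200.0, duty=0.5, amplitude=1.0):
    """One sample of a hypothetical pulse-modulated SAW drive voltage: a
    high-frequency carrier gated by a low-frequency burst envelope.  Switching
    the bursts on and off modulates surface friction, which the fingertip
    perceives as roughness."""
    phase_in_burst = (t * burst_hz) % 1.0
    gate = 1.0 if phase_in_burst < duty else 0.0   # burst on / off
    return amplitude * gate * math.sin(2 * math.pi * carrier_hz * t)

# Print a few samples near the start of a burst.
for i in range(5):
    t = i * 1e-7
    print(f"t={t:.1e} s  v={saw_drive_sample(t):+.3f}")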


Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems | 2007

A Wearable Haptic Display to Present the Gravity Sensation - Preliminary Observations and Device Design

Kouta Minamizawa; Hiroyuki Kajimoto; Naoki Kawakami; Susumu Tachi

We propose a wearable, ungrounded haptic display that presents a realistic gravity sensation of a virtual object. We focused on the shearing stress on the fingerpads due to the weight of the object, and found that the deformation of the fingerpads can generate a reliable gravity sensation even when the proprioceptive sensation at the wrist or arm is absent. This implies that a non-grounded gravity display can be realized by reproducing the fingerpad deformation. Based on these observations, we conducted evaluation tests to guide the device design. We implemented a prototype device with a simple dual-motor structure and then evaluated how well operators recognized the gravity sensation presented on their fingerpads with this method.


IEEE Virtual Reality Conference | 2002

The SmartTool: a system for augmented reality of haptics

Takuya Nojima; Dairoku Sekiguchi; Masahiko Inami; Susumu Tachi

Previous research on augmented reality has mainly focused on augmenting visual or acoustic information. However, humans receive information not only through vision and hearing, but also through haptics. Haptic sensation is very intuitive, and some researchers are working on making use of haptics in augmented reality systems. Most previous research on haptics is based on static data, such as that generated from CAD or CT, so these systems have difficulty responding to a changing real environment in real time. In this paper, we propose a new concept for the augmented reality of haptics, the SmartTool. The SmartTool responds to the real environment by using real-time sensors and a haptic display. The sensors on the SmartTool measure the real environment and then convey that information to the user through haptic sensation. We also describe the prototype system we have developed.
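
A hedged sketch of the real-time loop such a system implies (read a sensor, derive a virtual constraint from it, output a force); the names, threshold, and stiffness are hypothetical and not the authors' implementation.

def smarttool_force(sensor_value, threshold, stiffness=50.0):
    """Map a real-time sensor reading to a haptic force.

    If the measured quantity (e.g. proximity to a region the tool must not
    enter) crosses the threshold, push back proportionally; otherwise stay
    transparent.  Returns force in newtons (positive = resist)."""
    overshoot = sensor_value - threshold
    return stiffness * overshoot if overshoot > 0 else 0.0

# Toy loop over a few fake sensor readings.
for reading in (0.2, 0.8, 1.1, 1.4):
    print(reading, "->", smarttool_force(reading, threshold=1.0))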


IEEE Transactions on Haptics | 2010

Finger-Shaped GelForce: Sensor for Measuring Surface Traction Fields for Robotic Hand

Katsunari Sato; Kazuto Kamiyama; Naoki Kawakami; Susumu Tachi

It is believed that the use of haptic sensors to measure the magnitude, direction, and distribution of a force will enable a robotic hand to perform dexterous operations. Therefore, we develop a new type of finger-shaped haptic sensor using GelForce technology. GelForce is a vision-based sensor that can be used to measure the distribution of force vectors, or surface traction fields. The simple structure of GelForce enables us to develop a compact finger-shaped GelForce for a robotic hand. A GelForce designed on the basis of elasticity theory calculates surface traction fields using a conversion equation. However, this conversion equation cannot be solved analytically when the elastic body of the sensor has a complicated shape, such as that of a finger. Therefore, we propose an observational method and construct a prototype of the finger-shaped GelForce. Using this prototype, we evaluate the basic performance of the finger-shaped GelForce. We then conduct a field test by performing grasping operations with a robotic hand. The results show that, using the observational method, the finger-shaped GelForce can be successfully employed in a robotic hand.
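
The "observational method" is described only at a high level; one plausible, hedged reading is an empirical calibration in which known forces are applied to the finger-shaped surface and a linear conversion matrix is fitted by least squares. The sketch below uses entirely synthetic data to show that idea, not the authors' procedure.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic calibration set: apply known forces, record camera observations.
n_samples, n_features, n_force = 200, 12, 3
true_map = rng.normal(size=(n_force, n_features))        # unknown in practice
observations = rng.normal(size=(n_samples, n_features))   # marker-displacement features
forces = observations @ true_map.T + 0.01 * rng.normal(size=(n_samples, n_force))

# Fit the conversion matrix empirically (least squares), since an analytic
# solution is unavailable for a finger-shaped elastic body.
fitted_map, *_ = np.linalg.lstsq(observations, forces, rcond=None)
fitted_map = fitted_map.T                                  # shape (3, n_features)

# Use it: convert a new observation into a force estimate.
new_obs = rng.normal(size=n_features)
print("estimated force vector:", fitted_map @ new_obs)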

Collaboration


Dive into Susumu Tachi's collaborations.

Top Co-Authors

Hiroyuki Kajimoto

University of Electro-Communications

Eimei Oyama

National Institute of Advanced Industrial Science and Technology

Dzmitry Tsetserukou

Toyohashi University of Technology
