Publication


Featured research published by Katsuhito Akahane.


Transactions on Edutainment I | 2008

A haptic virtual environment for molecular chemistry education

Makoto Sato; Xiangning Liu; Jun Murayama; Katsuhito Akahane; Masaharu Isshiki

We attempted to produce an environment for education. Subjects such as mathematics or the sciences are usually studied at a desk in a classroom. Our research goal is to allow students to study scientific content more viscerally than with existing study methods, through haptic interaction. To construct such an environment, we have to build a user-friendly haptic interface. The study is described in two parts. The first part defines what a useful haptic interface is; here we focused on the grip of a haptic interface. SPIDAR-G is a haptic interface manipulated through a grip suspended by 8 strings. Grip size is an important parameter for usability, and we found an optimal sphere size through SPIDAR-G usability testing. The second part defines how teachers can use the interactive system, with its haptic interaction, as a teaching aid. Here we focused on the interaction between two water molecules. First, we constructed an environment in which the Van der Waals force as well as the electrostatic force can be felt through haptic interaction. Then we observed the effectiveness of the environment when used by a class of students.
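The pairwise interaction mentioned above can be illustrated with a standard force model. The sketch below is not the paper's molecular model: it assumes a Lennard-Jones term for the Van der Waals contribution and a point-charge Coulomb term, with illustrative parameters, and the helper pair_force is hypothetical.

```python
import numpy as np

def pair_force(r_vec, epsilon=0.65, sigma=0.32, q1=-0.8, q2=0.4, k_e=138.935):
    """Force on particle 1 from particle 2 (kJ/mol/nm):
    Lennard-Jones (Van der Waals) plus Coulomb (electrostatic)."""
    r = np.linalg.norm(r_vec)
    u = r_vec / r                         # unit vector from particle 2 to particle 1
    # Lennard-Jones force magnitude: 24*eps*(2*(sigma/r)^12 - (sigma/r)^6) / r
    sr6 = (sigma / r) ** 6
    f_lj = 24.0 * epsilon * (2.0 * sr6 ** 2 - sr6) / r
    # Coulomb force magnitude: k_e*q1*q2 / r^2 (positive = repulsive)
    f_coul = k_e * q1 * q2 / r ** 2
    return (f_lj + f_coul) * u

# Example: force between two interaction sites 0.3 nm apart; in a haptic
# setting this vector would be scaled before being sent to the device.
print(pair_force(np.array([0.3, 0.0, 0.0])))
```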


International Conference on Artificial Reality and Telexistence | 2013

Development of string-based multi-finger haptic interface SPIDAR-MF

Lanhai Liu; Satoshi Miyake; Katsuhito Akahane; Makoto Sato

This article introduces and explains the design of a new string-based multi-finger haptic interface named SPIDAR-MF (Space Interface Device for Artificial Reality Multi-Finger), which can display 3-degree-of-freedom spatial force feedback for each finger of a human hand through 4 strings attached to each fingertip. By measuring the lengths of the 4 strings attached to each fingertip, SPIDAR-MF can calculate the fingertip's position, and it then applies force feedback to each fingertip by using motors to adjust the string tensions. A rotary frame on which the motors are mounted is proposed to keep the strings from twisting as the fingers move. The experimental results show that SPIDAR-MF is an effective and feasible way to provide a multi-finger haptic interaction environment for objects in the virtual world.
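A minimal sketch of the position-from-string-lengths step described above, assuming the string exit points on the frame are known. This is a generic linearized multilateration, not necessarily the exact method implemented in SPIDAR-MF; the function name and frame geometry are hypothetical.

```python
import numpy as np

def fingertip_position(anchors, lengths):
    """Estimate a fingertip position from 4 string lengths.

    anchors: (4, 3) string exit points on the frame.
    lengths: (4,) measured string lengths.
    Linearizes the sphere equations |p - a_i| = l_i by subtracting the
    first one, then solves the resulting linear system for p.
    """
    a = np.asarray(anchors, dtype=float)
    l = np.asarray(lengths, dtype=float)
    A = 2.0 * (a[1:] - a[0])                                   # (3, 3)
    b = (np.sum(a[1:] ** 2, axis=1) - np.sum(a[0] ** 2)
         - (l[1:] ** 2 - l[0] ** 2))                           # (3,)
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p

# Hypothetical tetrahedral frame (metres) and consistent string lengths.
anchors = [[0.1, 0.1, 0.1], [-0.1, -0.1, 0.1], [-0.1, 0.1, -0.1], [0.1, -0.1, -0.1]]
p_true = np.array([0.02, -0.01, 0.03])
lengths = [np.linalg.norm(p_true - np.array(a)) for a in anchors]
print(fingertip_position(anchors, lengths))   # recovers ~[0.02, -0.01, 0.03]
```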


Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems | 2005

A development of high definition haptic controller

Katsuhito Akahane; Shoichi Hasegawa; Yasuharu Koike; Makoto Sato

This paper describes a haptic controller for force-feedback systems in virtual environments. For stable haptic rendering, a device control frequency higher than 1 kHz is required. In this research, we developed a high definition haptic controller with a SuperH4 processor. In order to achieve a high device control frequency, the controller interpolates haptic information from primitive geometries and the impedance of a virtual wall. SPIDAR, a wire-driven haptic display, uses this controller to realize a high control frequency. Consequently, it becomes possible to display a very hard virtual wall stably. The high definition haptic controller (HDHC) allows stable haptics and a high-quality sensation in virtual environments.
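The core of high-rate virtual-wall rendering is recomputing a penalty force from freshly sampled device state on every control cycle, while the wall description is only updated, and interpolated, at a lower rate. The sketch below is a generic spring-damper wall with linear interpolation of the wall state between low-rate updates; the gains, rates, and function name are illustrative assumptions, not the HDHC implementation.

```python
def render_wall_highrate(device_pos, device_vel, wall_prev, wall_next, alpha,
                         k=3000.0, b=2.0):
    """One high-rate rendering step for a virtual wall at height `wall`.

    wall_prev, wall_next: wall heights from two consecutive low-rate
    simulation updates; alpha in [0, 1] is the fraction of the low-rate
    interval elapsed, so the wall state is linearly interpolated.
    """
    wall = (1.0 - alpha) * wall_prev + alpha * wall_next
    penetration = wall - device_pos           # > 0 when the device is inside the wall
    if penetration <= 0.0:
        return 0.0
    return k * penetration - b * device_vel   # spring-damper penalty force

# A 1 kHz (or faster) control loop would call this with a freshly sampled
# device position and velocity each cycle, advancing alpha between the
# lower-rate simulation updates.
```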


International Conference on Computer Graphics and Interactive Techniques | 2015

Wearable 6-DoF wrist haptic device "SPIDAR-W"

Kazuki Nagai; Soma Tanoue; Katsuhito Akahane; Makoto Sato

Recent developments in HMD technology have made it possible to give users the feeling that they themselves have entered the virtual environment (VE). An HMD is a wearable visual device mounted on the user's head: by blocking out information from the outside world and presenting only the VE image in front of the user's eyes, it lets the user become immersed in the VE. It therefore becomes possible, using an HMD, to reproduce experiences such as riding a roller coaster or freely exploring a virtual house.


International Conference on Human Haptic Sensing and Touch Enabled Computer Applications | 2014

Development of Two-Handed Multi-finger Haptic Interface SPIDAR-10

Lanhai Liu; Satoshi Miyake; Naoki Maruyama; Katsuhito Akahane; Makoto Sato

This article describes the development of a wire-driven multi-finger haptic interface named SPIDAR-10 (Space Interface Device for Artificial Reality), which can render 3-degree-of-freedom spatial force feedback for the human fingers through 4 wires attached to each fingertip. SPIDAR-10 enables users to manipulate virtual objects in a VR world with the ten fingers of both hands. With two rotary cylindrical frames on which the motors are mounted, interference between the wires can be reduced. A method of frame control is also proposed. Experimental results on the performance of the basic SPIDAR system and the rotary frame are also given.
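Complementary to estimating the fingertip position from wire lengths is converting a desired fingertip force into non-negative wire tensions, since a wire can only pull. The sketch below does this with a non-negative least-squares fit on top of a pretension; the geometry, pretension value, and function name are illustrative assumptions rather than the SPIDAR-10 controller.

```python
import numpy as np
from scipy.optimize import nnls

def string_tensions(directions, force, t_min=0.5):
    """Distribute a desired 3-DoF fingertip force over 4 wires.

    directions: (4, 3) unit vectors from the fingertip toward each wire's
    frame anchor (a wire can only pull along this direction).
    force: desired 3-vector to display at the fingertip.
    Each tension is kept at or above a pretension t_min so the wires stay taut.
    """
    U = np.asarray(directions, dtype=float).T             # (3, 4)
    residual = np.asarray(force, dtype=float) - t_min * U.sum(axis=1)
    extra, _ = nnls(U, residual)                          # extra tensions >= 0
    return t_min + extra

# Hypothetical wire geometry: fingertip near the centre of a tetrahedral frame.
dirs = np.array([[1, 1, 1], [-1, -1, 1], [-1, 1, -1], [1, -1, -1]]) / np.sqrt(3)
print(string_tensions(dirs, force=[0.0, 0.0, 1.0]))
```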


International Conference on Human Haptic Sensing and Touch Enabled Computer Applications | 2014

Vibration and Subsequent Collision Simulation of Finger and Object for Haptic Rendering

Shoichi Hasegawa; Yukinobu Takehana; Alfonso Balandra; Hironori Mitake; Katsuhito Akahane; Makoto Sato

Humans can discriminate an object's material [5, 7, 9] and the tapping position [8] by perceiving tapping vibrations. Susa et al. [4] proposed simulating the natural vibration of an object in order to present arbitrarily structured objects. However, the vibration of the tapping finger and the subsequent collisions between the finger and the object were not simulated.
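For context, the natural-vibration rendering that this work builds on is commonly modeled as a sum of exponentially decaying sinusoids, one per vibration mode. The sketch below synthesizes such a tapping transient; the modal frequencies, decay rates, and amplitudes are illustrative, and it does not include the finger vibration or the subsequent collisions that this paper adds.

```python
import numpy as np

def tap_vibration(freqs_hz, decays, amps, duration=0.05, rate=10000):
    """Synthesize a tapping transient as a sum of exponentially decaying
    sinusoids (one per vibration mode); parameters are illustrative."""
    t = np.arange(0.0, duration, 1.0 / rate)
    signal = np.zeros_like(t)
    for f, beta, a in zip(freqs_hz, decays, amps):
        signal += a * np.exp(-beta * t) * np.sin(2.0 * np.pi * f * t)
    return t, signal

# e.g. a short, quickly decaying two-mode transient
t, s = tap_vibration(freqs_hz=[220.0, 600.0], decays=[80.0, 140.0], amps=[1.0, 0.4])
```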


International Conference on Artificial Reality and Telexistence | 2013

Feeling wind: An interactive haptization system for motion rendering in video contents using SPIDAR

Anusha Jayasiri; Kenji Honda; Katsuhito Akahane; Makoto Sato

We present a method for feeling the movement of objects in object-rich image sequences using the SPIDAR-G haptic device. The method addresses two major drawbacks of our previous research on object motion rendering in image sequences. On the one hand, it changes the role of the user from passive to active by enabling the user to select the object whose movement he or she wants to feel. On the other hand, it reduces the impact of background noise on the haptic feedback by limiting the region used for the motion-force calculation to an area of a specified size around the point selected by the user. This paper presents the details of the proposed method and some preliminary results of an experimental evaluation involving real users. The experimental results show that haptic feedback enhances the user experience with videos. They further show that the proposed system provides smooth and immersive interaction with object-rich video content.
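The windowing idea described above can be sketched as follows: given a dense optical-flow field computed elsewhere, only the flow inside a fixed-size window around the user-selected pixel is averaged and mapped to a force. The window size, gain, clamp, and function name are illustrative assumptions, not the parameters used in the paper.

```python
import numpy as np

def motion_force(flow, point, window=32, gain=0.5, f_max=3.0):
    """Map local image motion to a 2-D force for the haptic device.

    flow: (H, W, 2) dense optical-flow field (pixels/frame), computed elsewhere.
    point: (x, y) pixel selected by the user.
    Only flow inside a window x window region around the point contributes,
    which limits the influence of background motion.
    """
    h, w = flow.shape[:2]
    x, y = point
    half = window // 2
    x0, x1 = max(0, x - half), min(w, x + half)
    y0, y1 = max(0, y - half), min(h, y + half)
    mean_flow = flow[y0:y1, x0:x1].reshape(-1, 2).mean(axis=0)
    force = gain * mean_flow
    norm = np.linalg.norm(force)
    if norm > f_max:                      # clamp for device safety
        force *= f_max / norm
    return force
```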


International Conference on Human Computer Interaction | 2011

Development of a high definition haptic rendering for stability and fidelity

Katsuhito Akahane; Takeo Hamada; Takehiko Yamaguchi; Makoto Sato

In this study, we developed and evaluated a 10 kHz high definition haptic rendering system driven by a real-time, video-rate (60 Hz) simulation. Our proposal required both fidelity and stability in a multi-rate system with a frequency ratio of approximately 160. To satisfy these two criteria, several problems had to be resolved. To achieve stability alone, we could use a virtual coupling method to link the haptic display and a virtual object. However, because of its low coupling impedance, this method is not good for realizing fidelity and quality of manipulation. We therefore developed a multi-rate system with two levels of up-sampling for both fidelity and stability of the haptic sensation. The first up-sampling level achieved stability through the virtual coupling, and the second achieved fidelity through 10 kHz haptic rendering, compensating for the haptic quality lost in the coupling process. We confirmed that, with the proposed system, we could achieve both stability and fidelity of haptic rendering, through a computer simulation and with a 6-DOF haptic interface (SPIDAR-G) coupled to a rigid-object simulation engine.
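A minimal sketch of the two ingredients named above: a virtual-coupling spring-damper between the device and the simulated object's proxy, and interpolation of the low-rate simulation state inside the high-rate haptic loop. Gains, rates, and function names are illustrative assumptions; the paper's actual two-level up-sampling scheme is more involved than this.

```python
def coupling_force(device_pos, device_vel, proxy_pos, proxy_vel,
                   k_c=800.0, b_c=3.0):
    """Virtual-coupling force linking the haptic device to the simulated
    object's proxy (spring-damper; gains are illustrative)."""
    return k_c * (proxy_pos - device_pos) + b_c * (proxy_vel - device_vel)

def upsample_state(state_prev, state_next, alpha):
    """Linear interpolation between two low-rate simulation states, evaluated
    inside the high-rate haptic loop (alpha in [0, 1])."""
    return (1.0 - alpha) * state_prev + alpha * state_next

# In a 10 kHz loop driven by 60 Hz simulation updates, alpha advances by
# 60/10000 each haptic cycle, and the coupling force is recomputed from the
# freshly sampled device state and the interpolated proxy state.
```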


International Conference on Human Interface and Management of Information | 2017

Research on High Fidelity Haptic Interface Based on Biofeedback

Katsuhito Akahane; Makoto Sato

In this paper, we propose a high fidelity haptic interface based on biofeedback. When we interact with a very stiff virtual object in the virtual world through a haptic interface, the interface frequently becomes unstable, and we cannot feel the virtual object stably. On the other hand, when we interact with a real object in the real world, the dynamics of our fingers and arm continuously change and adjust to values appropriate for that object, so we can feel the real object stably. By exploiting this adaptation, we aimed to achieve a high fidelity haptic interface. In this study, the proposed system measured the grasping force generated by the user while interacting with a virtual object through the haptic interface, and it controlled the coupling impedance between the virtual object and the haptic interface by using the grasping force as biofeedback. In order to measure the grasping force, we developed a new end effector for the string-based impedance haptic device SPIDAR-G. We conducted evaluation experiments on the proposed system. The experimental results indicate that the proposed system improved the maximum stiffness of the virtual coupling and achieved both stability and fidelity by using biofeedback.
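The biofeedback step can be pictured as a simple mapping from measured grasping force to coupling stiffness: a firmer grasp allows a stiffer, higher-fidelity coupling, while a light grasp keeps the coupling soft and stable. The sketch below shows only that mapping, with made-up constants; it is not the paper's actual impedance control law.

```python
def coupling_stiffness(grasp_force, k_min=200.0, k_max=2000.0, gain=300.0):
    """Adapt the virtual-coupling stiffness to the measured grasping force.

    A firmer grasp (higher grasp_force, in newtons) raises the coupling
    stiffness toward k_max; a light grasp keeps it near k_min, where the
    interface stays comfortably stable. All constants are illustrative.
    """
    return min(k_max, k_min + gain * grasp_force)
```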


International Conference on Computer Graphics and Interactive Techniques | 2010

A realtime and direct-touch interaction for 3D woven cultural artifact exhibition

Wataru Wakita; Katsuhito Akahane; Masaharu Isshiki; Hiromi T. Tanaka

We propose a real-time, direct-touch interaction system for the exhibition of 3D woven cultural artifacts. Our system exhibits the cultural artifact in high definition and enables real-time, direct-touch interaction based on our texture-based haptic modeling and rendering techniques. In our technique, the reaction force changes according to the pixel values of the 2D texture images of the 3D model surface on the screen.
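The pixel-driven reaction force can be sketched as a lookup of a texture value at the contact point that modulates the contact stiffness. The height map, texture coordinates, and constants below are illustrative assumptions, not the paper's haptic texture model.

```python
import numpy as np

def texture_reaction_force(height_map, u, v, penetration,
                           k_base=400.0, k_texture=600.0):
    """Reaction-force magnitude modulated by a 2-D texture.

    height_map: 2-D array of pixel values in [0, 1] sampled from the model's
    surface texture; (u, v) are texture coordinates in [0, 1].
    The pixel value scales the contact stiffness, so brighter texels feel
    harder or more raised. Constants are illustrative.
    """
    h, w = height_map.shape
    px = height_map[min(int(v * h), h - 1), min(int(u * w), w - 1)]
    k = k_base + k_texture * px
    return k * max(penetration, 0.0)
```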

Collaboration


Katsuhito Akahane's top co-authors and their affiliations.

Top Co-Authors

Makoto Sato, Tokyo Institute of Technology
Shoichi Hasegawa, Tokyo Institute of Technology
Xiangning Liu, Tokyo Institute of Technology
Anusha Jayasiri, Tokyo Institute of Technology
Jun Murayama, Tokyo Institute of Technology
Lanhai Liu, Tokyo Institute of Technology
Satoshi Miyake, Tokyo Institute of Technology
Yasuharu Koike, Tokyo Institute of Technology