Yoichi Ochiai
University of Tsukuba
Publication
Featured research published by Yoichi Ochiai.
PLOS ONE | 2014
Yoichi Ochiai; Takayuki Hoshi; Jun Rekimoto
The essence of levitation technology is the countervailing of gravity. It is known that an ultrasound standing wave is capable of suspending small particles at its sound pressure nodes. In conventional studies, the acoustic axis of the ultrasound beam was parallel to the gravitational force, and the levitated objects were manipulated along that fixed axis (i.e. one-dimensionally) by controlling the phases or frequencies of bolted Langevin-type transducers. In the present study, we considered extended acoustic manipulation whereby millimetre-sized particles were levitated and moved three-dimensionally by localised ultrasonic standing waves generated by ultrasonic phased arrays. Our manipulation system has two original features. One is the direction of the ultrasound beam, which is arbitrary because the force acting toward its centre is also utilised. The other is the manipulation principle, by which a localised standing wave is generated at an arbitrary position and moved three-dimensionally by opposed ultrasonic phased arrays. We experimentally confirmed that expanded-polystyrene particles of 0.6 mm, 1 mm, and 2 mm in diameter could be manipulated by the proposed method.
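The phased-array focusing described above can be sketched in a few lines: each transducer is driven with a phase offset that compensates its path length to the desired focal point, so all waves arrive there in phase. This is a minimal illustration, not the authors' implementation; the 40 kHz drive frequency and the array geometry are assumptions.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C

def focus_phases(transducer_positions, focal_point, freq_hz=40e3):
    """Phase offset (radians) for each transducer so that all emitted
    waves arrive at the focal point in phase. 40 kHz is a typical
    frequency for airborne ultrasonic phased arrays (an assumption)."""
    k = 2 * math.pi * freq_hz / SPEED_OF_SOUND   # wavenumber
    phases = []
    for pos in transducer_positions:
        d = math.dist(pos, focal_point)          # path length to focus
        phases.append((-k * d) % (2 * math.pi))  # compensate travel delay
    return phases

# Example: four transducers on a line, focusing 0.1 m above the centre.
array = [(i * 0.01, 0.0, 0.0) for i in range(4)]
print(focus_phases(array, (0.015, 0.0, 0.1)))
```

Transducers equidistant from the focal point receive identical phases, which is what makes the wavefronts add constructively at the focus.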
International Conference on Computer Graphics and Interactive Techniques | 2015
Yoichi Ochiai; Kota Kumagai; Takayuki Hoshi; Jun Rekimoto; Satoshi Hasegawa; Yoshio Hayasaki
We envision laser-induced plasma technology in general applications for public use. If laser-induced plasma aerial images were made available, many useful applications such as spatial aerial AR, aerial user interfaces, and volumetric images could be produced. This would be a highly effective display for expressing three-dimensional information. Volumetric expression has considerable merit because the content scale corresponds to the human body; therefore, this technology could be usefully applied to wearable materials and spatial user interaction. Further, laser focusing can add a dimension to conventional projection technology: whereas projection is designed for surface mapping, laser focusing is capable of volumetric mapping. This technology can be effectively used in real-world-oriented user interfaces.
Japanese Journal of Applied Physics | 2014
Takayuki Hoshi; Yoichi Ochiai; Jun Rekimoto
A three-dimensional acoustic manipulation in air is presented. Two arrays of ultrasonic transducers are arranged opposite each other, generating a localized standing wave at an arbitrary position through the phased-array focusing technique. Small particles are suspended in the nodes of the standing wave and also manipulated according to the position of the standing wave. This paper gives the following principles of the proposed method: the theory of acoustic levitation, the ultrasonic phased array, and the estimation of the radial and axial forces. It was experimentally confirmed that particles of 0.6 mm diameter are trapped in the nodes. The length of the localized standing wave, the suspension endurance, and the size of the work space were investigated. It was also demonstrated that a mass of particles can be scooped up when the localized standing wave moves through the mass.
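The half-wavelength node spacing that governs where particles are trapped can be sketched numerically. The 40 kHz frequency and 50 mm array gap below are assumed values for illustration, not parameters taken from the paper.

```python
SPEED_OF_SOUND = 343.0  # m/s in air

def node_positions(freq_hz, gap_m, antinode_at=0.0):
    """Pressure-node positions (m) along the axis between two opposed
    arrays separated by gap_m. Nodes are spaced half a wavelength apart;
    the exact offset depends on the phase relationship between the
    arrays, so antinode_at is an assumed reference position."""
    wavelength = SPEED_OF_SOUND / freq_hz
    nodes = []
    z = antinode_at + wavelength / 4   # first node a quarter wave from the antinode
    while z < gap_m:
        nodes.append(round(z, 6))
        z += wavelength / 2            # successive nodes every half wavelength
    return nodes

# For 40 kHz (wavelength ~8.6 mm) over a 50 mm gap:
print(node_positions(40e3, 0.05))
```

Moving the localised standing wave shifts this whole ladder of nodes, carrying the trapped particles with it.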
Human Factors in Computing Systems | 2016
Yoichi Ochiai; Kota Kumagai; Takayuki Hoshi; Satoshi Hasegawa; Yoshio Hayasaki
We present a new method of rendering aerial haptic images that uses femtosecond-laser light fields and ultrasonic acoustic fields. In conventional research, a single physical quantity has been used to render aerial haptic images. In contrast, our method combines multiple fields (light and acoustic) at the same time. Although these fields do not directly interfere, combining them provides benefits such as multi-resolution haptic images and a synergistic effect on haptic perception. We conducted user studies with laser haptics and ultrasonic haptics separately and then tested their superposition. The results showed that the acoustic field affects the tactile perception of the laser haptics. We explored augmented reality/virtual reality (AR/VR) applications that provide haptic feedback by combining these two methods. We believe that the results of this study contribute to the exploration of laser haptic displays and expand the expressive range of aerial haptic displays based on other principles.
Advances in Computer Entertainment Technology | 2013
Yoichi Ochiai; Alexis Oyama; Takayuki Hoshi; Jun Rekimoto
It is difficult to dynamically change the optical properties of ordinary screens. In conventional projection systems, the choice of screens is limited, and the brightness of projected images and the viewing angle are unalterable once a screen is fixed, even though there is demand for altering the viewing angle according to the location and requirements of an installation. The results of our study indicate that a colloidal membrane can be used as a screen by vibrating it at a high frequency with ultrasonic waves. On the basis of those results, in this paper we discuss the implementation of a screen whose brightness and viewing angle can be changed dynamically, and we investigate its optical characteristics. Our investigations reveal that the screen can be deformed by stronger ultrasonic waves, that frames of various shapes can be used to create it, and that we can interact with it by inserting our fingers because it is made of a colloidal solution.
International Conference on Computer Graphics and Interactive Techniques | 2012
Yoichi Ochiai; Alexis Oyama; Keisuke Toyoshima
It is common knowledge that the surface of a soap bubble is a micro membrane: it allows light to pass through and displays colour through its structure. We developed an ultra-thin, flexible BRDF screen using a mixture of two colloidal liquids. There has been previous research on dynamic BRDF displays [1], but our work differs in several respects. Our membrane screen can be controlled using ultrasonic vibrations, and the membrane can change its transparency and surface state depending on the amplitude of the ultrasonic waves. Based on these properties, we developed several applications of the membranes, such as a 3D volume screen.
International Conference on Human Haptic Sensing and Touch Enabled Computer Applications | 2014
Yoichi Ochiai; Takayuki Hoshi; Jun Rekimoto; Masaya Takasaki
In this study, we develop and implement a method for transforming real-world textures. By applying a squeeze film effect to real-world textures, we diminish the haptic textures that are perceived. This method can transform real-world textures, e.g., from paper-like to metal-like, or from wood-like to paper-like. The textures provided by this system are inherently high resolution because real-world textures are used instead of synthesized data. We implemented the system using a 28-kHz transducer and evaluated it using a three-axis accelerometer.
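As a sketch of how such an accelerometer-based evaluation might quantify the effect, one could compare the RMS vibration of a fingertip trace with and without the squeeze film active. The attenuation metric below is an illustrative assumption, not the paper's actual measure.

```python
import math

def rms(samples):
    """Root-mean-square of accelerometer samples, a common summary
    statistic for vibrotactile texture signals."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def texture_attenuation_db(raw, damped):
    """Attenuation (dB) of texture vibration when the squeeze film is
    active. `raw` and `damped` are acceleration traces recorded while
    stroking the same surface; this metric is an assumption made for
    illustration, not one taken from the paper."""
    return 20 * math.log10(rms(raw) / rms(damped))

# A trace whose amplitude halves corresponds to ~6 dB of attenuation:
print(texture_attenuation_db([1.0, -1.0, 1.0, -1.0], [0.5, -0.5, 0.5, -0.5]))
```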
IEEE Global Conference on Consumer Electronics | 2013
Yoichi Ochiai; Takayuki Hoshi; Alexis Oyama; Jun Rekimoto
We previously developed a display using a soap film as a screen. This screen can change the appearance of projected images by altering its reflectance properties under ultrasonic control. The soap film has the further advantages of being very thin and disposable. This research aims to exploit these advantages to realize new interactions with a display: the soap screen pops out and breaks, users can insert their fingers into the screen, and when the screen breaks, it can be replaced easily. This display is expected to contribute to the entertainment computing community as a deformable and physically interactive display. In this paper, we present the details of the proposed display, related experimental results, discussion, and future work.
Augmented Human International Conference | 2017
Ayaka Ebisu; Satoshi Hashizume; Kenta Suzuki; Akira Ishii; Mose Sakashita; Yoichi Ochiai
In musical performances, it is important to produce rhythms correctly. However, when beginners play musical instruments, it can be difficult for them to understand rhythms using only visual and auditory information. To solve this problem, we propose the Stimulated Percussions (SP) system, which generates rhythms on a computer and transfers them to a user's muscles. In this study, we control the user's arms and legs using electrical muscle stimulation (EMS). We attach electrodes near certain arm and leg muscles and provide stimulation in a manner that allows users to reproduce the correct movements when they play instruments. Our system enables a single player or multiple players to reproduce generated rhythms correctly. Experimental results show that our system is useful for beginners learning musical instruments, because it allows accurate rhythms to be mastered through bodily sensation.
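Converting a computer-generated rhythm into timed stimulation events could be sketched as follows. The limb names and the 150 ms default pulse width are assumptions for illustration, not values from the paper.

```python
def rhythm_to_pulses(pattern, bpm, pulse_ms=150):
    """Turn a rhythm pattern into (start_ms, duration_ms, limb) EMS
    pulse events. `pattern` is a list of (beat_index, limb) pairs;
    the limb labels and 150 ms pulse width are illustrative
    assumptions, not parameters taken from the paper."""
    beat_ms = 60000.0 / bpm  # milliseconds per beat
    return [(round(beat * beat_ms), pulse_ms, limb) for beat, limb in pattern]

# A simple alternating two-limb pattern at 120 BPM (one beat every 500 ms):
events = rhythm_to_pulses([(0, "right_arm"), (1, "left_arm"),
                           (2, "right_arm"), (3, "left_arm")], bpm=120)
print(events)
```

A driver loop would then dispatch each event to the stimulator channel attached near the corresponding muscle at its scheduled start time.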
User Interface Software and Technology | 2016
Akira Ishii; Ippei Suzuki; Shinji Sakamoto; Keita Kanai; Kazuki Takazawa; Hiraku Doi; Yoichi Ochiai
We present a novel manipulation method that subconsciously changes the walking direction of users via visual processing on a head-mounted display (HMD). Unlike existing navigation systems, which require users to recognize information and then follow directions as two separate, conscious processes, the proposed method guides users without requiring them to pay attention to the information provided by the navigation system, and also allows their direction to be manipulated graphically by controllers. In the proposed system, users perceive the real world through stereo images provided by a stereo camera and the HMD. While walking, the navigation system gives users real-time feedback by processing the images they have just perceived and presenting visual stimuli. This study examined two image-processing methods for manipulating a human's walking direction: a moving stripe pattern and a changing focal region. Experimental results indicate that the changing focal region method leads walkers most effectively, changing their walking path by approximately 200 mm/m on average.
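A moving stripe pattern like the one examined in the study can be sketched as a per-frame image operation: alternating vertical stripes of the camera image are shifted sideways over time, adding a lateral optical-flow cue. The stripe width and speed here are illustrative values, not the study's parameters.

```python
import numpy as np

def moving_stripes(frame, t, stripe_px=40, speed_px=2):
    """Shift alternating vertical stripes of `frame` horizontally as a
    function of time step t, producing a sideways optical-flow cue.
    stripe_px and speed_px are illustrative assumptions, not the
    parameters used in the study."""
    h, w = frame.shape[:2]
    out = frame.copy()
    for x0 in range(0, w, 2 * stripe_px):        # every other stripe moves
        cols = np.arange(x0, min(x0 + stripe_px, w))
        out[:, cols] = frame[:, (cols + t * speed_px) % w]  # wrap at edges
    return out

# A 1x12 test "image" with pixel values equal to column indices:
frame = np.arange(12, dtype=np.uint8).reshape(1, 12)
print(moving_stripes(frame, t=1, stripe_px=3, speed_px=1))
```

Applied to each stereo frame before display, the shifted stripes bias perceived self-motion while the unshifted stripes keep the scene recognizable.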