Kikuo Asai
The Open University of Japan
Publications
Featured research published by Kikuo Asai.
Virtual Reality Continuum and Its Applications in Industry | 2010
Kikuo Asai; Yuji Sugimoto; Mark Billinghurst
We have developed a lunar surface navigation system combining tabletop augmented reality and virtual environments, which facilitates collaboration between children and parents through active learning behaviors. It visualizes multimedia data by geographic location in the areas of the lunar surface that were explored by the NASA Apollo missions. We designed scenarios based on real episodes from the exploration activities and set up a quasi-role play with children acting as astronauts and their parents acting as mission commanders. We conducted a user study to investigate the properties of the lunar surface navigation system. The experiment was done in a practical setting at an exhibit in the Modern Industrial Science Museum. The results suggested that the lunar surface navigation system could provide a learning environment in which children and their parents actively learn together.
Archive | 2010
Kikuo Asai
This chapter is a study of visualization techniques for geographic information in an augmented reality environment, focusing on the user interface. Augmented reality is a technology that superimposes information over a real scene, enhancing the real world with scene-linked information. The information is overlaid onto the real scene dynamically, based on the user’s viewpoint. When mixed reality is defined as a continuum of environments spread between reality and virtuality (Milgram et al., 1994), augmented reality occupies a certain range along the reality-virtuality continuum. An augmented reality environment has the following characteristics: 1) it combines real and virtual worlds; 2) it is interactive in real time; and 3) virtual objects are spatially registered in three-dimensional (3D) space (Azuma, 1997). In this chapter, however, the term “augmented reality” is used to mean a 3D user interface that provides the user with an interactive environment. Moreover, a desktop monitor is included as part of the interactive tabletop environment for presenting the visual scene captured by a video camera, although a head-mounted display (HMD) has typically been used as the display device. A comparison of user experiences with the HMD and a liquid crystal display (LCD) monitor showed that the LCD was easier to use than the HMD under certain conditions (Asai & Kobayashi, 2007). This was because of visual instability caused by unstable images from the camera attached to the HMD. The coordinate systems of the workspace and the visual feedback are then inconsistent in the environment using a desktop monitor, which requires a mental transformation by the user. In terms of interaction techniques, on the other hand, this tabletop environment offers the great merits of visual stability and tangible interaction with information and virtual objects.
There have been various visualization applications based on geographic information that use augmented reality technology. Our aim here is to introduce the basic concept of geographic information visualization using augmented reality, report an example with our lunar surface browsing tools, and show the features of the user interface in geographic information visualization using map-based augmented reality. First, the basic concept of a geographic information visualization system is introduced by briefly describing the general technology and relevant applications of augmented reality. Geographic information visualization using augmented reality falls mainly into two styles: visualizing navigation information using a GIS (geographic information system) in the real world, and visualizing geographically embedded information using a real map on
Virtual Reality Continuum and Its Applications in Industry | 2010
Noritaka Osawa; Kikuo Asai
A rotation adjustment method for precisely and efficiently manipulating a virtual 3D object by hand in an immersive virtual environment is proposed. The relative direction between both hands adjusts the rotation of the virtual object. The adjustment method also uses spherical linear interpolation based on quaternion algebra to scale down rotations, making small rotation adjustments easier. The scaled adjustment enables a user to manipulate the virtual object precisely. Activation of the rotation adjustment is controlled by the distance between both hands or by the distance between the thumb and forefinger. This rotation adjustment method was implemented along with translational position, viewpoint, and release adjustment methods, and combinations of these adjustment and control methods were evaluated in an experiment. The experimental results suggest that the rotation adjustment method helps a user who cannot precisely control rotation with the hand pinching the virtual object.
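The core of the scaled adjustment can be sketched as follows: interpolate from the identity quaternion toward the hand-derived rotation by a small gain, so a large hand rotation maps to a small object rotation. This is a minimal illustration of slerp-based rotation scaling, not the authors' implementation; the function names and the `gain` parameter are assumptions for the example.

```python
import math

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions (w, x, y, z)."""
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:                      # flip to take the shorter arc
        q1 = [-c for c in q1]
        dot = -dot
    dot = min(dot, 1.0)
    theta = math.acos(dot)
    if theta < 1e-9:                   # nearly identical rotations
        return list(q0)
    s0 = math.sin((1.0 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return [s0 * a + s1 * b for a, b in zip(q0, q1)]

def scale_rotation(q, gain):
    """Shrink rotation q toward the identity by `gain` in [0, 1]:
    a hypothetical small-gain mapping from hand rotation to object rotation."""
    return slerp([1.0, 0.0, 0.0, 0.0], q, gain)
```

With a gain of 0.1, a 90-degree hand rotation about an axis produces a 9-degree object rotation about the same axis, which is the sense in which slerp "scales down" rotations.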
2016 20th International Conference Information Visualisation (IV) | 2016
Kikuo Asai; Norio Takase; Makoto Sato
We have developed a haptic visualization system for molecular docking applications in education. Although molecular docking is a 3D phenomenon, desktop and laptop PCs rely on the mouse as the common interface, which limits manipulation to 2D. Our system applies 2D lateral forces as haptic feedback using the SPIDAR-mouse and maps the surface of a 3D object to the 2D plane. The SPIDAR-mouse is a simple haptic device with 2D lateral forces, which makes it suitable for educational environments. In this paper, we discuss haptic visualization techniques using a 2D development view and a processing method for generating cuts on the 3D object surface for a prototype of our system. Our visualization system enables users to interact intuitively with a 3D object using 2D lateral force feedback.
2015 IEEE International Symposium on Haptic, Audio and Visual Environments and Games (HAVE) | 2015
Kikuo Asai; Norio Takase; Makoto Sato
The SPIDAR-mouse is a string-based haptic device that provides a PC mouse with 2D lateral force. Although the SPIDAR-mouse is convenient for displaying haptic feedback in a PC environment, it must present 3D geometric shapes with only 2D lateral force. An impulse-based haptic rendering method was used in our previous system, but the system response was insufficient for realistic haptic interaction due to the huge number of polygons in the synthesized model. We revisit a gradient-based method combined with a penalty-based method to simulate contact between bumps and a probe in haptic rendering. A preliminary experiment was performed to evaluate the haptic rendering method for lateral force feedback. The results show that the gradient-based method is comparable in task performance to the impulse-based method for simple models with a small number of polygons.
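The idea behind gradient-based rendering with a penalty term can be sketched simply: store the surface as a height map, take the local gradient at the probe position, and emit a lateral force proportional to the slope, directed downhill. This is an illustrative sketch under those assumptions, not the paper's actual renderer; the grid representation and `stiffness` parameter are invented for the example.

```python
def lateral_force(height, x, y, stiffness=1.0, dx=1.0):
    """Gradient-based 2D lateral force from a height map (list of rows).
    The probe at interior grid cell (x, y) is pushed downhill, so bumps
    resist the probe on the way up and assist it on the way down."""
    # central differences approximate the surface gradient
    gx = (height[y][x + 1] - height[y][x - 1]) / (2.0 * dx)
    gy = (height[y + 1][x] - height[y - 1][x]) / (2.0 * dx)
    # penalty-style force: proportional to slope, opposing the gradient
    return (-stiffness * gx, -stiffness * gy)
```

Because the force is a local lookup plus two subtractions, the cost per haptic frame is constant, which is why a gradient scheme can outrun impulse-based rendering on polygon-heavy models.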
2014 IEEE International Symposium on Haptic, Audio and Visual Environments and Games (HAVE) Proceedings | 2014
Kikuo Asai; Norio Takase; Makoto Sato
We have developed a haptic interaction system that enables users to interact with a synthesized model of real objects using the SPIDAR-mouse. Although the SPIDAR-mouse is convenient for displaying haptic feedback in a PC environment, the force it presents is limited to 2D lateral force. A gradient-based method is typically used for generating 2D lateral force in haptic rendering, but it did not perform well enough at representing geometric shapes. We therefore introduced an impulse-based method to the haptic rendering for representing bump information with 2D lateral force. In addition, a polygon reduction function was implemented to reduce the number of polygons in the synthesized model, improving response performance. A preliminary experiment was conducted to evaluate the haptic interaction system. The results indicated that users were able to distinguish 3D shapes with distinctive geometric conditions, but had difficulty distinguishing fine differences such as similar types of bumps.
Systems, Man and Cybernetics | 2013
Kikuo Asai; Norio Takase; Makoto Sato
A haptic visualization system was built with the SPIDAR-mouse for 2D lateral force feedback in molecular docking. An educational environment requires easy installation, low cost, and reasonable performance, all of which the SPIDAR-mouse can provide. However, it limits the representation of force to 2D, even though molecular interaction is a 3D phenomenon. Our approach is to map the surface of the 3D molecules to the 2D plane. The haptic rendering is done by calculating forces from the 2D map, which greatly reduces the computational cost. The molecular interaction, based on electrostatic potential and conformational properties, is displayed with 2D lateral force feedback using the SPIDAR-mouse.
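One way the 2D-map approach cuts cost is that the expensive physics (electrostatics, conformation) is precomputed onto the unwrapped surface map once, and the haptic loop only interpolates force values at the probe's map coordinates. A minimal sketch of that lookup step, with invented names and a bilinear interpolation scheme assumed for the example:

```python
def sample_force(fx_map, fy_map, u, v):
    """Look up precomputed 2D lateral force at fractional map coordinates
    (u, v) by bilinear interpolation. fx_map and fy_map are lists of rows
    holding force components precomputed on the unwrapped molecular surface."""
    def bilinear(g, u, v):
        i, j = int(u), int(v)          # cell corner
        a, b = u - i, v - j            # fractional offsets within the cell
        return ((1 - a) * (1 - b) * g[j][i] + a * (1 - b) * g[j][i + 1]
                + (1 - a) * b * g[j + 1][i] + a * b * g[j + 1][i + 1])
    return bilinear(fx_map, u, v), bilinear(fy_map, u, v)
```

The haptic loop then runs at a fixed, small per-frame cost regardless of how expensive the underlying potential calculation was, which is what makes the scheme practical on classroom PCs.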
International Journal of Virtual and Personal Learning Environments | 2011
Kikuo Asai; Norio Takase
This article presents the characteristics of a tangible tabletop environment produced by augmented reality (AR), aimed at improving the environment in which learners observe three-dimensional molecular structures. The authors performed two evaluation experiments. A performance test of the user interface demonstrated that learners in the tangible AR environment were able to complete the task of identifying molecular structures more quickly and accurately than those in a typical desktop-PC environment using a Web browser. A usability test, in which participants learned molecular structures and answered relevant questions, demonstrated that the environments had no effect on their learning of molecular structures. However, a preference test revealed that learners preferred the tangible AR environment to the Web-browser environment in terms of overall enjoyment, reality of manipulation, and sense of presence, whereas they preferred the Web-browser environment in terms of ease of viewing, experience, and durability.
Transactions on edutainment IV | 2010
Kikuo Asai; Tomotsugu Kondo; Akira Mizuki; Mark Billinghurst
The lunar surface collaborative browsing system was developed for exhibitions at science museums. It visualizes multimedia data by geographic location in the area of the lunar surface explored by the NASA Apollo missions, providing visitors with a collaborative-learning environment through networked interactive functions. We designed scenarios based on real episodes from the exploration activities and set up a quasi-role-play with children acting as astronauts and their parents acting as mission commanders. Children manipulate a rover on the lunar surface and view the landscape from a viewpoint on the lunar surface, while parents instruct their children to find objects or information to complete tasks. Our contribution is a collaborative-learning environment that integrates map-based and virtual environments to view the lunar surface from exocentric and egocentric viewpoints. The system's capabilities encourage children and their parents to get together to learn.
Virtual Systems and Multimedia | 2009
Kikuo Asai; Kimio Kondo; Hideaki Kobayashi
Despite convenient functions supporting videoconferencing, it is often difficult to maintain awareness and attention during remote lectures. One possible approach to such difficulties is to present using a flip board held by the presenter, instead of presentation slides, which are usually displayed on a separate screen away from the presenter's face. This flip-board-based presentation style makes it easy for viewers to see the presenter's face and the presentation materials simultaneously, facilitating eye contact. We performed experimental lectures to investigate the properties of the flip-board and conventional presentation styles. The subjective evaluation showed that participants scored the flip-board-based presentation higher in terms of eye contact than the traditional presentation style, which they felt suited them less well.