Mai Otsuki
University of Tsukuba
Publication
Featured research published by Mai Otsuki.
user interface software and technology | 2010
Mai Otsuki; Kenji Sugihara; Asako Kimura; Fumihisa Shibata; Hideyuki Tamura
Many digital painting systems have been proposed, and their quality is improving. In these systems, graphics tablets are widely used as input devices. However, because of its rigid nib and indirect manipulation, the operational feel of a graphics tablet differs from that of a real paintbrush. We addressed this problem by developing the MR-based Artistic Interactive (MAI) Painting Brush, which imitates a real paintbrush, and constructed a mixed reality (MR) painting system that enables direct painting on physical objects in the real world.
society of instrument and control engineers of japan | 2008
Mai Otsuki; Asako Kimura; Fumihisa Shibata; Hideyuki Tamura
Mixed reality (MR), which merges the real and virtual worlds in real time, is a further extension of conventional virtual reality (VR). Although there have been many studies in the field of MR, very few discuss interaction methods. In this study, we propose two novel methods for interacting with an MR space: RealSound Interaction and ToolDevice.
user interface software and technology | 2012
Ryan Arisandi; Yusuke Takami; Mai Otsuki; Asako Kimura; Fumihisa Shibata; Hideyuki Tamura
ToolDevice is a set of devices developed to help users with spatial work such as layout design and three-dimensional (3D) modeling. It consists of three components: TweezersDevice, Knife/HammerDevice, and BrushDevice, which use hand-tool metaphors to help users recognize each device's unique functions. We have developed a mixed reality (MR) 3D modeling system that imitates real-life woodworking using the TweezersDevice and the Knife/HammerDevice. In the system, users can pick up and move virtual objects with the TweezersDevice, and cut and join virtual objects with the Knife/HammerDevice. By repeating these operations, users can build virtual wood models.
Proceedings of the IEEE | 2014
Ryan Arisandi; Mai Otsuki; Asako Kimura; Fumihisa Shibata; Hideyuki Tamura
Most 3-D modeling software is difficult for beginners to learn. The operations are often complicated, and the user is required to have prior mathematical knowledge. Therefore, we developed a simple modeling system using ToolDevice to simplify such operations. ToolDevice consists of a set of interaction devices that use metaphors of real-life hand tools to help users recognize each device's unique functions. Using TweezersDevice, KnifeDevice, and HammerDevice, we developed a mixed reality (MR) 3-D modeling system that imitates real-life woodworking. In the system, TweezersDevice is used to pick up and move objects, while KnifeDevice and HammerDevice are used, respectively, to cut and join virtual objects represented as wood materials. In this study, we describe the motivation for developing the system, the available interactions, and the procedures for creating 3-D models in the system. We also present the results of a user study in which we compare user performance in our system with that in common 3-D modeling software. Finally, we discuss the contributions and limitations of this study and future work.
AsiaHaptics | 2015
Vibol Yem; Mai Otsuki; Hideaki Kuzuoka
An outer-covering haptic display (OCHD) is a device that imparts a guiding force sensation to the back of a learner's hand and guides the learner in manipulating a tool. Our previous study found that an OCHD provides a skin deformation sensation and can guide a learner with less drive force than the alternative method in which the tool is directly actuated. In this study, we developed a wearable outer-covering haptic display (wOCHD) for hand motion, with two ball effectors that deform the skin and provide guiding information along four axes of motion.
international conference on computer graphics and interactive techniques | 2009
Yusuke Takami; Mai Otsuki; Asako Kimura; Fumihisa Shibata; Hideyuki Tamura
One day, a boy named Daichi created a sketch of a lovely table and chair, and he wanted to convert them to 3D models. But he realized that he did not know how. His computer skills were limited.
Proceedings of the 2016 ACM International Conference on Interactive Surfaces and Spaces | 2016
Yuji Sano; Koya Sato; Ryoichiro Shiraishi; Mai Otsuki
To bridge the gap between ball game players of differing skill levels, we propose the Sports Support System, which reduces skill differences among players by augmenting the traditional ball game. As one example, we implemented a system that visualizes the trajectory and velocity of the ball in soccer games. This visualization, together with the enhancement of player positions on the field, can help beginner players recognize what they should do next and improve their playing skills. This paper introduces the proposed system and discusses, through a user study, its effectiveness in improving reaction speed and enhancing the pleasure of playing sports. Based on the experiment results, the proposed system contributes to reaction time improvements in passing and receiving the ball, one of the most important skills in playing soccer.
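The abstract above mentions visualizing the trajectory and velocity of a tracked ball. As a minimal sketch of the kind of computation such a visualization needs (the actual tracking pipeline is not described in the abstract, so the function name and inputs here are illustrative assumptions), per-frame velocity can be estimated by finite differences over tracked positions:

```python
def ball_velocity(positions, timestamps):
    """Estimate per-frame 2D ball velocity from tracked positions.

    positions  -- list of (x, y) points in field coordinates (metres)
    timestamps -- matching list of times in seconds
    Returns a list of (vx, vy) velocity vectors, one per frame pair.
    """
    velocities = []
    for (x0, y0), (x1, y1), t0, t1 in zip(
        positions, positions[1:], timestamps, timestamps[1:]
    ):
        dt = t1 - t0
        # Simple first-order finite difference between consecutive frames.
        velocities.append(((x1 - x0) / dt, (y1 - y0) / dt))
    return velocities
```

A real system would smooth these estimates (e.g. with a moving average or Kalman filter) before rendering them, since raw frame-to-frame differences are noisy.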
human factors in computing systems | 2017
Takumi Azai; Shuhei Ogawa; Mai Otsuki; Fumihisa Shibata; Asako Kimura
Mixed reality (MR) space merges the real and virtual worlds in real time and makes it possible to present and manipulate virtual objects in the real world. However, manipulating virtual objects requires menus, and where to display menus in MR space and how to operate them are open problems. For example, a virtual touch menu shown in front of a user's face cannot provide the user with touch sensation and interferes with the user's sight. In this study, we propose a method that displays a menu on the user's forearm, which is always within reach of the user's hand. The user obtains touch feedback by directly touching their own forearm. An application was developed using this menu, and an informal user study at a previous conference was successful, leaving some minor points to be improved.
human factors in computing systems | 2017
Mai Otsuki; Taiki Kawano; Keita Maruyama; Hideaki Kuzuoka; Yusuke Suzuki
A long-standing challenge in video-mediated communication systems is to correctly represent a remote participant's gaze direction in the local environment. To address this issue, we developed ThirdEye, an add-on eye display for a video communication system. This display is made from artificial ulexite (TV rock) cut into a hemispherical shape, enabling light from the bottom surface to be projected onto the hemisphere's surface. By drawing an appropriate ellipse on an LCD and placing ThirdEye over it, the system simulates an eyeball. Our experiment showed that an observer could perceive a remote Looker's gaze direction more precisely when the gaze was presented using ThirdEye than when it was presented using the Looker's face on a flat display.
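The ThirdEye abstract describes drawing "an appropriate ellipse" under the hemisphere so that it reads as a pupil looking in the remote participant's gaze direction. A plausible version of that mapping (the function, parameter names, and constants below are illustrative assumptions, not the authors' actual implementation) offsets the pupil toward the gaze direction and foreshortens it by the rotation angle:

```python
import math

def pupil_ellipse(yaw_deg, pitch_deg, eye_radius_px=40, pupil_px=14):
    """Map a gaze direction to a pupil ellipse drawn on the LCD.

    yaw_deg / pitch_deg -- horizontal and vertical gaze angles
    eye_radius_px       -- on-screen radius of the simulated eyeball
    pupil_px            -- pupil diameter when looking straight ahead
    Returns (cx, cy, major, minor): ellipse centre offset from the
    hemisphere centre, plus the ellipse axis lengths in pixels.
    """
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    # Pupil centre shifts toward the gaze direction on the eyeball.
    cx = eye_radius_px * math.sin(yaw) * math.cos(pitch)
    cy = eye_radius_px * math.sin(pitch)
    # Foreshortening: the circular pupil appears as an ellipse whose
    # minor axis shrinks as the eye rotates away from the observer.
    tilt = math.acos(math.cos(yaw) * math.cos(pitch))
    return cx, cy, pupil_px, pupil_px * math.cos(tilt)
```

Looking straight ahead (`yaw = pitch = 0`) yields a centred circle; larger angles slide the pupil outward and flatten it, which is the visual cue an observer reads as gaze direction.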
symposium on 3d user interfaces | 2013
Mai Otsuki; Tsutomu Oshita; Asako Kimura; Fumihisa Shibata; Hideyuki Tamura
In this study, we implemented a stable and intuitive detach method named "Touch & Detach" for complex 3D virtual objects. In typical modeling software, the parts of a complex 3D object are grouped for efficient operation and ungrouped to observe or manipulate a part in detail. Our method uses an elastic metaphor to prevent incorrect operations and to improve operational feel and responsiveness. In addition, by simulating a virtual elastic band connecting the parts, our method can represent the connection between parts and its strength, helping users understand the relationships between the parts of a complex virtual object. This paper presents the details of our proposed method and a user study.
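The elastic-band metaphor above can be sketched in code: a grabbed part is pulled back toward its anchor by a spring-like force until the user stretches the band past a threshold, at which point the part detaches. The constants and function below are illustrative assumptions for explanation only, not the paper's actual parameters:

```python
import math

def touch_and_detach(anchor, grabbed, k=8.0, detach_len=0.15):
    """Virtual elastic band between a grabbed part and its parent object.

    anchor, grabbed -- 3D positions (metres) of the attachment point and
                       the part being pulled
    k               -- spring stiffness of the elastic band
    detach_len      -- stretch at which the part detaches from the group
    Returns (force, detached): the restoring force on the grabbed part,
    and whether the band has snapped.
    """
    dx = [g - a for g, a in zip(grabbed, anchor)]
    length = math.sqrt(sum(d * d for d in dx))
    detached = length > detach_len
    if detached or length == 0.0:
        # Band snapped (or no stretch): no restoring force remains.
        return (0.0, 0.0, 0.0), detached
    # Hooke's law: force proportional to stretch, directed back to anchor.
    force = tuple(-k * d for d in dx)
    return force, detached
```

The growing resistance gives the user tactile-style feedback that the parts are connected, and the snap threshold makes detachment a deliberate gesture rather than an accidental drag.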