Hirokazu Kato
Hiroshima City University
Publications
Featured research published by Hirokazu Kato.
International Symposium on Mixed and Augmented Reality | 1999
Hirokazu Kato; Mark Billinghurst
We describe an augmented reality conferencing system that overlays virtual images on the real world. Remote collaborators are represented on virtual monitors that can be freely positioned about a user in space. Users can collaboratively view and interact with virtual objects using a shared virtual whiteboard. This is made possible by precise virtual image registration using fast and accurate computer vision techniques and head-mounted display (HMD) calibration. We propose a method for tracking fiducial markers and a calibration method for optical see-through HMDs based on this marker tracking.
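As a rough illustration of the fiducial-tracking step, the sketch below recovers a camera-from-marker pose from the four corners of a detected square marker. It uses OpenCV's ArUco module as a stand-in for ARToolKit's own square-marker detector (an assumption; the marker size and function name are hypothetical):

```python
# Minimal sketch of square-fiducial pose estimation, in the spirit of the
# paper's marker tracking. OpenCV ArUco stands in for ARToolKit (assumption).
import cv2
import numpy as np

MARKER_SIZE = 0.08  # marker side length in metres (hypothetical value)

# 3D corners of the marker in its own frame, in the order required by
# cv2.SOLVEPNP_IPPE_SQUARE (top-left, top-right, bottom-right, bottom-left).
OBJECT_POINTS = np.array([
    [-MARKER_SIZE / 2,  MARKER_SIZE / 2, 0],
    [ MARKER_SIZE / 2,  MARKER_SIZE / 2, 0],
    [ MARKER_SIZE / 2, -MARKER_SIZE / 2, 0],
    [-MARKER_SIZE / 2, -MARKER_SIZE / 2, 0],
], dtype=np.float32)

DICTIONARY = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
DETECTOR = cv2.aruco.ArucoDetector(DICTIONARY, cv2.aruco.DetectorParameters())

def marker_poses(frame, camera_matrix, dist_coeffs):
    """Detect square fiducials and return a camera-from-marker pose per ID."""
    corners, ids, _ = DETECTOR.detectMarkers(frame)
    poses = {}
    if ids is not None:
        for marker_corners, marker_id in zip(corners, ids.flatten()):
            # solvePnP recovers the marker's rotation/translation relative
            # to the camera from the four detected corner points.
            ok, rvec, tvec = cv2.solvePnP(
                OBJECT_POINTS, marker_corners.reshape(4, 2),
                camera_matrix, dist_coeffs,
                flags=cv2.SOLVEPNP_IPPE_SQUARE)
            if ok:
                poses[int(marker_id)] = (rvec, tvec)
    return poses
```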
International Symposium on Mixed and Augmented Reality | 2000
Hirokazu Kato; Mark Billinghurst; Ivan Poupyrev; Kenji Imamoto; Keihachiro Tachibana
We address the problems of virtual object interaction and user tracking in a table-top augmented reality (AR) interface. This setting demands very accurate tracking and registration techniques as well as an intuitive and useful interface. This is especially true in AR interfaces for supporting face-to-face collaboration, where users need to be able to cooperate easily with each other. We describe an accurate vision-based tracking method for table-top AR environments, and tangible user interface (TUI) techniques based on this method that allow users to manipulate virtual objects in a natural and intuitive manner. Our approach is robust: users can cover some of the tracking markers while the system still returns camera viewpoint information, overcoming one of the limitations of traditional computer vision-based systems. After presenting this technique, we describe its use in prototype AR applications.
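The occlusion tolerance comes from estimating pose against a known multi-marker layout, so whichever markers remain visible still constrain the camera viewpoint. Below is a minimal sketch of that idea using OpenCV's ArUco GridBoard as a stand-in for the paper's own multi-marker method (an assumption; the grid dimensions and marker sizes are hypothetical):

```python
# Sketch of occlusion-tolerant table-top tracking: pose is computed from
# whichever subset of a known marker layout is currently visible.
import cv2

DICTIONARY = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
# A 4x3 grid of 6 cm markers with 1 cm gaps (hypothetical layout).
BOARD = cv2.aruco.GridBoard((4, 3), 0.06, 0.01, DICTIONARY)
DETECTOR = cv2.aruco.ArucoDetector(DICTIONARY, cv2.aruco.DetectorParameters())

def table_pose(frame, camera_matrix, dist_coeffs):
    """Camera pose relative to the table-top board from visible markers."""
    corners, ids, _ = DETECTOR.detectMarkers(frame)
    if ids is None:
        return None
    # Match only the detected markers against the board layout; occluded
    # markers simply contribute no correspondences.
    obj_pts, img_pts = BOARD.matchImagePoints(corners, ids)
    if obj_pts is None or len(obj_pts) < 4:
        return None
    ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts,
                                  camera_matrix, dist_coeffs)
    return (rvec, tvec) if ok else None
```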
Computers & Graphics | 2001
Mark Billinghurst; Hirokazu Kato; Ivan Poupyrev
The MagicBook is a Mixed Reality interface that uses a real book to seamlessly transport users between Reality and Virtuality. A vision-based tracking method is used to overlay virtual models on real book pages, creating an Augmented Reality (AR) scene. When users see an AR scene they are interested in, they can fly inside it and experience it as an immersive Virtual Reality (VR) world. The interface also supports multi-scale collaboration, allowing multiple users to experience the same virtual environment from either an egocentric or an exocentric perspective. In this paper we describe the MagicBook prototype, potential applications, and user feedback.
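To make the overlay step concrete, the sketch below projects a virtual wireframe model into the camera image once a page's pose is known, for instance from marker tracking as in the earlier sketch. Drawing edges with OpenCV stands in for a full renderer (an assumption; the function and its inputs are hypothetical):

```python
# Sketch of the AR overlay step: project virtual geometry through the
# estimated page pose and draw it over the camera frame.
import cv2
import numpy as np

def overlay_model(frame, model_points, model_edges, rvec, tvec,
                  camera_matrix, dist_coeffs):
    """Draw a wireframe model registered to the tracked book page."""
    # Project the model's 3D vertices into image coordinates.
    image_points, _ = cv2.projectPoints(
        model_points, rvec, tvec, camera_matrix, dist_coeffs)
    image_points = image_points.reshape(-1, 2)
    for i, j in model_edges:  # each edge is a pair of vertex indices
        cv2.line(frame,
                 tuple(map(int, image_points[i])),
                 tuple(map(int, image_points[j])),
                 (0, 255, 0), 2)
    return frame
```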
Communications of the ACM | 2002
Mark Billinghurst; Hirokazu Kato
Blending reality and virtuality, these interfaces let users see each other along with virtual objects, allowing communication behaviors much closer to face-to-face interaction than to screen-based collaboration.
Ubiquitous Computing | 2002
Adrian David Cheok; Xubo Yang; Zhou Zhi Ying; Mark Billinghurst; Hirokazu Kato
This paper presents a novel computer entertainment system which recaptures human touch and physical interaction with the real-world environment as essential elements of game play, whilst also maintaining the exciting fantasy features of traditional computer entertainment. Our system, called ‘Touch-Space’, is an embodied (ubiquitous, tangible, and social) computing based Mixed Reality (MR) game space which regains the physical and social aspects of traditional game play. In this novel game space, the real-world environment is an essential and intrinsic game element, and the player's physical context influences the game play. It also provides the full spectrum of game interaction experience, ranging from the real physical environment (human-to-human and human-to-physical-world interaction), to augmented reality, to the virtual environment. It allows tangible interactions between players and virtual objects, and collaborations between players at different levels of reality. Thus, the system re-invigorates computer entertainment with social human-to-human and human-to-physical touch interactions.
International Conference on Multimedia and Expo | 2000
Mark Billinghurst; Ivan Poupyrev; Hirokazu Kato; Richard May
In the Shared Space project, we explore, design, and evaluate future computing environments that will radically enhance interaction between humans and computers, as well as interaction between humans mediated by computers. In particular, we investigate how augmented reality, enhanced by physical and spatial 3D user interfaces, can be used to develop effective face-to-face collaborative computing environments. How will we interact in such collaborative spaces? How will we interact with each other? What new applications can be developed using this technology? These are the questions we are trying to answer in the Shared Space research. This paper provides a short overview of Shared Space, its directions, technologies, and applications.
Presence: Teleoperators & Virtual Environments | 2002
Nicholas R. Hedley; Mark Billinghurst; Lori Postner; Richard May; Hirokazu Kato
In this paper, we describe two explorations in the use of hybrid user interfaces for collaborative geographic data visualization. Our first interface combines three technologies: augmented reality (AR), immersive virtual reality (VR), and computer vision-based hand and object tracking. Wearing a lightweight display with an attached camera, users can look at a real map and see three-dimensional virtual terrain models overlaid on the map. From this AR interface, they can fly in and experience the model immersively, or use free hand gestures or physical markers to change the data representation. Building on this work, our second interface explores alternative interface techniques, including a zoomable user interface, paddle interactions, and pen annotations. We describe the system hardware and software and the implications for GIS and spatial science applications.
International Symposium on Mixed and Augmented Reality | 2002
Simon Prince; Adrian David Cheok; Farzam Farbiz; Todd Williamson; N Johnson; Mark Billinghurst; Hirokazu Kato
We present a complete system for live capture of 3D content and simultaneous presentation in augmented reality. The user sees the real world from his viewpoint, but modified so that the image of a remote collaborator is rendered into the scene. Fifteen cameras surround the collaborator, and the resulting video streams are used to construct a three-dimensional model of the subject using a shape-from-silhouette algorithm. Users view a two-dimensional fiducial marker using a video-see-through augmented reality interface. The geometric relationship between the marker and the head-mounted camera is calculated, and the equivalent view of the subject is computed and drawn into the scene. Our system can generate 384 × 288 pixel images of the models at 25 fps, with a latency of less than 100 ms. The result gives the strong impression that the subject is a real part of the 3D scene. We demonstrate applications of this system in 3D videoconferencing and entertainment.
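The shape-from-silhouette step can be illustrated with a toy voxel-carving version: a voxel survives only if every camera projects it inside that camera's silhouette mask. The sketch below is a batch approximation of the paper's real-time visual-hull pipeline (an assumption; all inputs and names are hypothetical, and voxels are assumed to lie in front of every camera):

```python
# Toy shape-from-silhouette (visual hull) by voxel carving.
import numpy as np

def visual_hull(silhouettes, projections, voxels):
    """silhouettes: list of binary HxW masks, one per camera.
    projections: list of 3x4 camera projection matrices.
    voxels: (N, 3) array of candidate voxel centres in world space.
    Returns the voxel centres consistent with every silhouette."""
    keep = np.ones(len(voxels), dtype=bool)
    homog = np.hstack([voxels, np.ones((len(voxels), 1))])  # homogeneous
    for mask, P in zip(silhouettes, projections):
        uvw = homog @ P.T                      # project all voxels at once
        uv = (uvw[:, :2] / uvw[:, 2:3]).astype(int)
        h, w = mask.shape
        inside = ((uv[:, 0] >= 0) & (uv[:, 0] < w) &
                  (uv[:, 1] >= 0) & (uv[:, 1] < h))
        hit = np.zeros(len(voxels), dtype=bool)
        hit[inside] = mask[uv[inside, 1], uv[inside, 0]] > 0
        keep &= hit  # a voxel survives only if this view also sees it
    return voxels[keep]
```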
Human Factors in Computing Systems | 2001
Mark Billinghurst; Hirokazu Kato; Ivan Poupyrev
The MagicBook explores how interfaces can be developed that allow seamless transitions between Physical Reality, Augmented Reality (AR), and immersive Virtual Reality (VR) in a collaborative setting. The MagicBook is a normal book and can be read without any additional technology. However, when book pages are viewed through a handheld display, three-dimensional virtual images appear overlaid on them. Readers can view these AR scenes from any perspective and can also fly into the scenes and experience them as an immersive VR world. VR users see other VR users represented as life-sized virtual avatars, while AR users see VR users as miniature avatars in the scene.
International Conference on Computer Graphics and Interactive Techniques | 2008
Mark Billinghurst; Hirokazu Kato; Ivan Poupyrev
This paper advocates a new metaphor for designing three-dimensional Augmented Reality (AR) applications: Tangible Augmented Reality (Tangible AR). Tangible AR interfaces combine the enhanced display possibilities of AR with the intuitive manipulation and interaction of physical objects, or Tangible User Interfaces. We define what Tangible AR interfaces are, and present design guidelines and prototype interfaces based on these guidelines. Experiences with these interfaces show that the Tangible AR metaphor supports seamless interaction between the real and virtual worlds, and provides a range of natural interactions that are difficult to find in other AR interfaces. CR Categories: H.5.2 [User Interfaces]: Input devices and strategies; H.5.1 [Multimedia Information Systems]: Artificial, augmented, and virtual realities.
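One recurring Tangible AR pattern is using a tracked physical prop as the handle for a virtual object placed in a tracked workspace. A minimal sketch, assuming camera-from-marker poses such as those returned by the earlier marker_poses() sketch, composes the two poses to express the prop in workspace coordinates (all names here are hypothetical):

```python
# Sketch of a Tangible AR building block: the prop's pose in the
# workspace marker's frame, from two camera-from-marker poses.
import cv2

def relative_pose(rvec_ws, tvec_ws, rvec_prop, tvec_prop):
    """Pose of the prop expressed in the workspace marker's frame."""
    R_ws, _ = cv2.Rodrigues(rvec_ws)      # camera-from-workspace rotation
    R_prop, _ = cv2.Rodrigues(rvec_prop)  # camera-from-prop rotation
    # workspace-from-prop = inverse(camera-from-workspace) o camera-from-prop
    R_rel = R_ws.T @ R_prop
    t_rel = R_ws.T @ (tvec_prop - tvec_ws)
    return R_rel, t_rel
```

A virtual object attached to the prop can then be rendered at this relative pose, so moving the physical object moves its virtual counterpart in the shared workspace.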