Jun Rekimoto
University of Tokyo
Publication
Featured research published by Jun Rekimoto.
human factors in computing systems | 2002
Jun Rekimoto
This paper introduces a new sensor architecture for making interactive surfaces that are sensitive to human hand and finger gestures. This sensor recognizes multiple hand positions and shapes and calculates the distance between the hand and the surface by using capacitive sensing and a mesh-shaped antenna. In contrast to camera-based gesture recognition systems, all sensing elements can be integrated within the surface, and this method does not suffer from lighting and occlusion problems. This paper describes the sensor architecture, as well as two working prototype systems: a table-size system and a tablet-size system. It also describes several interaction techniques that would be difficult to perform without using this architecture.
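One way to picture the sensing step the abstract describes: each crossing point of the mesh antenna yields a capacitance reading, and a hand position can be interpolated from the grid of readings. The following is a minimal sketch under that assumption (the function name and the centroid-based interpolation are illustrative choices, not the paper's actual algorithm):

```python
import numpy as np

def estimate_hand(grid):
    """Estimate hand position and proximity from a 2D grid of
    capacitance readings (one value per antenna crossing point).
    Position is the value-weighted centroid of the readings;
    proximity is taken as proportional to the peak reading."""
    g = np.asarray(grid, dtype=float)
    total = g.sum()
    if total == 0:
        return None  # no hand detected anywhere on the surface
    ys, xs = np.indices(g.shape)
    cx = (xs * g).sum() / total  # sub-cell x position
    cy = (ys * g).sum() / total  # sub-cell y position
    proximity = g.max()          # stronger signal = closer hand
    return (cx, cy), proximity
```

Because the centroid is weighted across neighboring cells, the estimated position can fall between mesh lines, which is what makes a relatively coarse antenna grid usable for continuous gesture input.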
user interface software and technology | 1997
Jun Rekimoto
This paper proposes a new field of user interfaces called multi-computer direct manipulation and presents a pen-based direct manipulation technique that can be used for data transfer between different computers as well as within the same computer. The proposed Pick-and-Drop allows a user to pick up an object on a display and drop it on another display as if he/she were manipulating a physical object. Even though the pen itself does not have storage capabilities, a combination of Pen-ID and the pen manager on the network provides the illusion that the pen can physically pick up and move a computer object. Based on this concept, we have built several experimental applications using palm-sized, desktop, and wall-sized pen computers. We also considered the importance of physical artifacts in designing user interfaces in a future computing environment.
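The mechanism the abstract outlines, a pen that carries only an ID while a networked manager holds the picked-up object, can be sketched as follows (class and method names are hypothetical, not the paper's implementation):

```python
class PenManager:
    """Network-side 'pen manager' for Pick-and-Drop. The pen has
    no storage; the picked object is kept here, keyed by Pen-ID,
    which creates the illusion that the pen itself holds the data."""

    def __init__(self):
        self._held = {}  # Pen-ID -> object currently "on" that pen

    def pick(self, pen_id, obj):
        # Pen touches an object on display A: associate it with the pen.
        self._held[pen_id] = obj

    def drop(self, pen_id):
        # Pen touches display B: release and return whatever it "carried".
        return self._held.pop(pen_id, None)
```

Because both displays talk to the same manager, the transfer works across machine boundaries exactly as it does within a single machine, which is the point of "multi-computer direct manipulation".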
human factors in computing systems | 1999
Jun Rekimoto; Masanori Saitoh
This paper describes our design and implementation of a computer-augmented environment that allows users to smoothly interchange digital information among their portable computers, table and wall displays, and other physical objects. Supported by a camera-based object recognition system, users can easily integrate their portable computers with the pre-installed ones in the environment. Users can use displays projected on tables and walls as a spatially continuous extension of their portable computers. Using an interaction technique called hyperdragging, users can transfer information from one computer to another, by only knowing the physical relationship between them. We also provide a mechanism for attaching digital data to physical objects, such as a videotape or a document folder, to link physical and digital spaces.
Proceedings of DARE 2000 on Designing augmented reality environments | 2000
Jun Rekimoto; Yuji Ayatsuka
CyberCode is a visual tagging system based on 2D-barcode technology that provides several features not offered by other tagging systems. CyberCode tags can be recognized by the low-cost CMOS or CCD cameras found in more and more mobile devices, and they can also be used to determine the 3D position of the tagged object as well as its ID number. This paper describes examples of augmented reality applications based on CyberCode, and discusses some key characteristics of tagging technologies that must be taken into account when designing augmented reality environments.
user interface software and technology | 1995
Jun Rekimoto; Katashi Nagao
Current user interface techniques such as WIMP or the desktop metaphor do not support real world tasks, because the focus of these user interfaces is only on human–computer interactions, not on human–real world interactions. In this paper, we propose a method of building computer augmented environments using a situation-aware portable device. This device, called NaviCam, has the ability to recognize the user's situation by detecting color-code IDs in real world environments. It displays situation-sensitive information by superimposing messages on its video see-through screen. The combination of ID-awareness and a portable video see-through display solves several problems with current ubiquitous computing systems and augmented reality systems.
user interface software and technology | 1996
Jun Rekimoto
This TechNote introduces new interaction techniques for small screen devices such as palmtop computers or handheld electronic devices, including pagers and cellular phones. Our proposed method uses the tilt of the device itself as input. Using both tilt and buttons, it is possible to build several interaction techniques ranging from menus and scroll bars, to more complicated examples such as a map browsing system and a 3D object viewer. During operation, only one hand is required to both hold and control the device. This feature is especially useful for field workers.
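A common way to turn tilt into scrolling, consistent with the kind of interaction the abstract describes, is a dead zone plus clamped linear gain; the sketch below illustrates that general idea (the function name and the specific constants are assumptions, not values from the paper):

```python
def tilt_to_scroll(tilt_deg, dead_zone=5.0, gain=2.0, max_speed=40.0):
    """Map device tilt (degrees) to a scroll speed (lines/sec).
    Small tilts inside the dead zone are ignored so the view holds
    still while the device is held casually; beyond the dead zone,
    speed grows linearly with tilt and is clamped to max_speed."""
    if abs(tilt_deg) <= dead_zone:
        return 0.0
    speed = gain * (abs(tilt_deg) - dead_zone)
    speed = min(speed, max_speed)
    return speed if tilt_deg > 0 else -speed
```

The dead zone is what makes one-handed operation practical: the user can hold the device at a comfortable angle without the display drifting, and deliberate tilts past the threshold produce proportional scrolling.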
international symposium on wearable computers | 2001
Jun Rekimoto
In this paper we introduce two input devices for wearable computers, called GestureWrist and GesturePad. Both devices allow users to interact with wearable or nearby computers by using gesture-based commands. Both are designed to be as unobtrusive as possible, so they can be used under various social contexts. The first device, called GestureWrist, is a wristband-type input device that recognizes hand gestures and forearm movements. Unlike DataGloves or other hand gesture-input devices, all sensing elements are embedded in a normal wristband. The second device, called GesturePad, is a sensing module that can be attached on the inside of clothes, and users can interact with this module from the outside. It transforms conventional clothes into an interactive device without changing their appearance.
user interface software and technology | 2002
Ivan Poupyrev; Shigeaki Maruyama; Jun Rekimoto
This paper investigates the sense of touch as a channel for communicating with miniature handheld devices. We equipped a PDA with a TouchEngine™, a thin, miniature low-power tactile actuator that we designed specifically for use in mobile interfaces (Figure 1). Unlike previous tactile actuators, the TouchEngine is a universal tactile display that can produce a wide variety of tactile feelings, from simple clicks to complex vibrotactile patterns. Using the TouchEngine, we began exploring the design space of interactive tactile feedback for handheld computers. Here, we investigated only a subset of this space: using touch as the ambient, background channel of interaction. We proposed a general approach to designing such tactile interfaces and described several implemented prototypes. Finally, our user studies demonstrated 22% faster task completion when we enhanced handheld tilting interfaces with tactile feedback.
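To make "from simple clicks to complex vibrotactile patterns" concrete, a tactile click is often rendered as a short decaying sine burst driving the actuator. The sketch below generates such a drive signal; the function name and all parameter values are illustrative assumptions, not the TouchEngine's actual waveform:

```python
import math

def click_waveform(freq_hz=250, duration_s=0.02, sample_rate=8000, decay=200.0):
    """A minimal 'tactile click': a short, exponentially decaying
    sine burst of the kind a small tactile actuator could render.
    Returns a list of amplitude samples in [-1, 1]."""
    n = int(duration_s * sample_rate)
    return [
        math.exp(-decay * t / sample_rate)       # fast amplitude decay
        * math.sin(2 * math.pi * freq_hz * t / sample_rate)
        for t in range(n)
    ]
```

Varying the frequency, duration, and decay of such bursts, or sequencing several of them, is one way a single actuator can span a range of distinguishable tactile feelings.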
user interface software and technology | 1997
Nobuyuki Matsushita; Jun Rekimoto
This TechNote reports on our initial results of realizing a computer-augmented wall called the HoloWall. Using an infrared camera located behind the wall, this system allows a user to interact with this computerized wall using fingers, hands, their body, or even a physical object such as a document folder.
international symposium on wearable computers | 1998
Jun Rekimoto; Yuji Ayatsuka; Kazuteru Hayashi
Most existing augmented reality systems only provide a method for browsing information that is situated in the real world context. This paper describes a system that allows users to dynamically attach newly created digital information, such as voice notes or photographs, to the physical environment, through wearable computers as well as normal computers. Attached data is stored with contextual tags such as location IDs and object IDs that are obtained by wearable sensors, so that the same or other wearable users can notice the data when they come to the same context. Similar to the role that Post-it notes play in community messaging, we expect our proposed method to become a fundamental communication platform when wearable computers become commonplace.