Jun Kato
National Institute of Advanced Industrial Science and Technology
Publication
Featured research published by Jun Kato.
Programming Language Design and Implementation | 2013
Sebastian Burckhardt; Manuel Fähndrich; Peli de Halleux; Sean McDirmid; Michal Moskal; Nikolai Tillmann; Jun Kato
Live programming allows programmers to edit the code of a running program and immediately see the effect of the code changes. This tightening of the traditional edit-compile-run cycle reduces the cognitive gap between program code and execution, improving the learning experience of beginning programmers while boosting the productivity of seasoned ones. Unfortunately, live programming is difficult to realize in practice as imperative languages lack well-defined abstraction boundaries that make live programming responsive or its feedback comprehensible. This paper enables live programming for user interface programming by cleanly separating the rendering and non-rendering aspects of a UI program, allowing the display to be refreshed on a code change without restarting the program. A type and effect system formalizes this separation and provides an evaluation model that incorporates the code update step. By putting live programming on a more formal footing, we hope to enable critical and technical discussion of live programming systems.
Human Factors in Computing Systems | 2009
Jun Kato; Daisuke Sakamoto; Masahiko Inami; Takeo Igarashi
We must give robots some form of command to have them perform complex tasks. An initial instruction is required even if they carry out their tasks autonomously, so we need interfaces for operating and teaching robots. Natural language, joysticks, and other pointing devices are currently used for this purpose, but these interfaces make it difficult to operate multiple robots simultaneously. We developed a multi-touch interface with a top-down view from a ceiling camera for controlling multiple mobile robots. The user specifies a vector field on the view that all robots then follow. This paper describes the user interface, its implementation, and future work on the project.
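The core idea of steering many robots with one vector field can be sketched as follows. This is our own illustrative example, not the system's actual code: each robot samples a user-specified field function at its own position and takes a small step along the sampled vector, so a single field controls the whole group.

```python
# Hypothetical sketch of vector-field control for multiple robots:
# every robot samples the same field at its own (x, y) position and
# moves a small step along the resulting vector each tick.
def step_robots(robots, field, dt=0.1):
    """Advance every (x, y) robot position along the field for one tick."""
    new_positions = []
    for (x, y) in robots:
        vx, vy = field(x, y)                  # sample the field at the robot
        new_positions.append((x + dt * vx, y + dt * vy))
    return new_positions

# Example field: everything drifts toward the point (5, 5).
goal_field = lambda x, y: (5 - x, 5 - y)
robots = [(0.0, 0.0), (10.0, 0.0)]
robots = step_robots(robots, goal_field)      # both robots move toward (5, 5)
```

Calling `step_robots` repeatedly converges all robots on the goal; a user-drawn field would replace `goal_field` with vectors sampled from the multi-touch strokes.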
User Interface Software and Technology | 2012
Jun Kato; Sean McDirmid; Xiang Cao
The increasing popularity of interactive camera-based programs highlights the inadequacies of conventional IDEs in developing these programs given their distinctive attributes and workflows. We present DejaVu, an IDE enhancement that eases the development of these programs by enabling programmers to visually and continuously monitor program data in consistency with the frame-based pipeline of computer-vision programs; and to easily record, review, and reprocess temporal data to iteratively improve the processing of non-reproducible camera input. DejaVu was positively received by three experienced programmers of interactive camera-based programs in our preliminary user trial.
Human Factors in Computing Systems | 2015
Jun Kato; Tomoyasu Nakano; Masataka Goto
This paper presents TextAlive, a graphical tool that allows interactive editing of kinetic typography videos in which lyrics or transcripts are animated in synchrony with the corresponding music or speech. While existing systems have allowed the designer and casual user to create animations, most of them do not take into account synchronization with audio signals. They allow predefined motions to be applied to objects and parameters to be tweaked, but it is usually impossible to extend the predefined set of motion algorithms within these systems. We therefore propose an integrated design environment featuring (1) GUIs that designers can use to create and edit animations synchronized with audio signals, (2) integrated tools that programmers can use to implement animation algorithms, and (3) a framework for bridging the interfaces for designers and programmers. A preliminary user study with designers, programmers, and casual users demonstrated its capability in authoring various kinetic typography videos.
Human Factors in Computing Systems | 2013
Jun Kato; Daisuke Sakamoto; Takeo Igarashi
Current programming environments use textual or symbolic representations. While these representations are appropriate for describing logical processes, they are not appropriate for representing raw values such as human and robot posture data, which are necessary for handling gesture input and controlling robots. To address this issue, we propose Picode, a text-based development environment integrated with visual representations: photos of humans and robots. With Picode, the user first takes a photo to bind it to posture data, then drags and drops the photo into the code editor, where it is displayed as an inline image. A preliminary in-house user study suggested that taking photos has positive effects on the programming experience.
Designing Interactive Systems | 2012
Jun Kato; Daisuke Sakamoto; Takeo Igarashi
There are many toolkits for physical UIs, but most physical UI applications are not locomotive. When programmers want to make things move around in the environment, they face difficulties related to robotics. Toolkits for robot programming, unfortunately, are usually not as accessible as those for building physical UIs. To address this interdisciplinary issue, we propose Phybots, a toolkit that allows researchers and interaction designers to rapidly prototype applications with locomotive robotic things. The contributions of this research are the combination of a hardware setup, a software API, its underlying architecture, and a graphical runtime debugging tool that together support the whole prototyping activity. This paper introduces the toolkit, example applications, and lessons learned from three user studies.
User Interface Software and Technology | 2010
Jun Kato; Daisuke Sakamoto; Takeo Igarashi
We introduce a technique to detect simple surfing gestures (moving a hand horizontally) on a standard keyboard by analyzing, in real time, sounds recorded with a microphone attached close to the keyboard. This technique allows the user to keep their focus on the screen while surfing on the keyboard. Since it uses a standard keyboard without any modification, the user can take full advantage of the input functionality and tactile quality of their favorite keyboard, supplemented with our interface.
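One plausible building block for detecting the sustained friction sound of a hand moving across the keys is frame-based energy thresholding. The sketch below is purely illustrative and is not the paper's actual classifier; it flags audio frames whose RMS energy exceeds a threshold, using a synthetic signal in place of live microphone input.

```python
# Illustrative sketch only: frame-based RMS energy thresholding, one
# plausible first step toward detecting a sustained surfing sound.
import numpy as np

def detect_surfing(samples, rate=16000, frame_ms=20, threshold=0.05):
    """Return one boolean per frame: True where RMS energy exceeds threshold."""
    frame_len = rate * frame_ms // 1000
    n_frames = len(samples) // frame_len
    frames = samples[: n_frames * frame_len].reshape(n_frames, frame_len)
    rms = np.sqrt((frames ** 2).mean(axis=1))     # energy per 20 ms frame
    return rms > threshold

# Synthetic test signal: 0.2 s of silence followed by 0.2 s of noise
# (standing in for the friction sound picked up by the microphone).
rate = 16000
quiet = np.zeros(rate // 5)
noisy = 0.2 * np.random.default_rng(0).standard_normal(rate // 5)
flags = detect_surfing(np.concatenate([quiet, noisy]), rate)
```

A real detector would also need to distinguish surfing from ordinary keystrokes, e.g. by looking at how long the energy is sustained or at its spectral shape.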
International Conference on Computer Graphics and Interactive Techniques | 2009
Thomas Seifried; Christian Rendl; Florian Perteneder; Jakob Leitner; Michael Haller; Daisuke Sakamoto; Jun Kato; Masahiko Inami; Stacey D. Scott
The number of digital appliances and media found in domestic environments has risen drastically over the last decade: digital TVs, DVD and Blu-ray players, digital picture frames, digital gaming systems, electronically movable window blinds, and robotic vacuum cleaners, for example. As these devices gain Internet and wireless networking capabilities (e.g., Internet-ready TVs, streaming digital picture frames, and WiFi gaming systems such as Nintendo's Wii and Sony's PlayStation) and as home networking and WiFi infrastructures become more prevalent, new opportunities arise for consolidating control of these myriad devices and media into so-called universal remote controls. However, many remote controls lack intuitive interfaces for mapping control functions to the device intended to be controlled. This often results in trial-and-error button pressing, or experimentation with graphical user interface (GUI) controls, before users achieve their intended action.
IEEE Computer | 2016
Jun Kato; Takeo Igarashi; Masataka Goto
The programming-with-examples workflow lets developers create interactive applications with the help of example data. It takes a general programming environment and adds dedicated user interfaces for visualizing and managing the data. This lets both programmers and users understand applications and configure them to meet their needs.
Intelligent User Interfaces | 2016
Tomoyasu Nakano; Jun Kato; Masahiro Hamasaki; Masataka Goto
We propose a novel interface that allows the user to interactively change the playback order of multiple songs by choosing one or more criteria. The criteria include not only a song's title and artist name but also its content, automatically estimated by music/singing signal processing and artist-level social analysis; the artist-level social information is discovered from Wikipedia and DBpedia. With regard to manipulating playback order, existing interfaces typically allow the user to change it manually or automatically by choosing one of a few types of criteria. The proposed interface, on the other hand, deals with nine properties and multiple combinations of them (e.g., vocal gender and beats per minute). To realize ordering by multiple criteria, a distance matrix is computed from the criteria vectors and is then used either to estimate paths for ascending, descending, and random orders by applying principal component analysis, or to estimate a path for a smooth order by solving the travelling salesman problem.
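The principal-component ordering step can be illustrated as follows. This is our own simplified sketch under assumed inputs, not the paper's implementation: each song is a vector of criteria values, and projecting the vectors onto the first principal component yields a one-dimensional position from which an ascending playback order is read off.

```python
# Hypothetical sketch of ordering songs by multiple criteria via PCA:
# project each song's criteria vector onto the first principal component
# and sort the songs by that 1-D score.
import numpy as np

def ascending_order(criteria):
    """Sort songs by their projection onto the first principal component."""
    X = criteria - criteria.mean(axis=0)            # center each criterion
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    pc1 = vt[0]
    if pc1[np.abs(pc1).argmax()] < 0:               # fix SVD's arbitrary sign
        pc1 = -pc1
    scores = X @ pc1                                # 1-D position per song
    return [int(i) for i in np.argsort(scores)]     # indices in playback order

# Toy example: 4 songs described by (beats per minute, vocal-gender score).
songs = np.array([[120.0, 0.2], [90.0, 0.8], [150.0, 0.1], [100.0, 0.6]])
order = ascending_order(songs)   # here BPM dominates the variance, so the
                                 # order is essentially slowest to fastest
```

A descending order is the reverse of this path; the smooth order described in the abstract would instead run a travelling-salesman solver over the pairwise distance matrix. In practice the criteria would first be normalized so that no single property (like BPM) dominates by scale alone.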