Atau Tanaka
Goldsmiths, University of London
Publication
Featured research published by Atau Tanaka.
new interfaces for musical expression | 2006
Lalya Gaye; Lars Erik Holmquist; Frauke Behrendt; Atau Tanaka
The new field of mobile music emerges at the intersection of ubiquitous computing, portable audio technology and NIME. We have held a series of international workshops on this topic with leading projects and speakers, in order to establish a community and stimulate the development of the field. In this report, we define mobile music and map out the field by reporting on the workshop series and accounting for the state of the art.
new interfaces for musical expression | 2002
Atau Tanaka; R. Benjamin Knapp
This paper describes a technique of multimodal, multichannel control of electronic musical devices using two control methodologies, the electromyogram (EMG) and relative position sensing. Requirements for the application of multimodal interaction theory in the musical domain are discussed. We introduce the concept of bidirectional complementarity to characterize the relationship between the component sensing technologies. Each control can be used independently, but together they are mutually complementary. This reveals a fundamental difference from orthogonal systems. The creation of a concert piece based on this system is given as an example.
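The two-channel idea, one channel for muscle tension and one for limb position, each usable independently, can be illustrated with a small sketch. The filter constants, parameter names, and mappings below are illustrative assumptions, not the system described in the paper:

```python
def emg_envelope(samples, prev=0.0, alpha=0.9):
    """Rectify and low-pass filter raw EMG samples into a smooth
    amplitude envelope (a standard envelope follower, assumed here)."""
    env = prev
    for s in samples:
        env = alpha * env + (1 - alpha) * abs(s)
    return env

def map_controls(emg_env, rel_position):
    """Map the two independent channels to two hypothetical synthesis
    parameters: muscle tension drives amplitude, relative position
    drives pitch bend. Either channel works alone; together they
    shape one sound, sketching the complementarity described above."""
    amplitude = min(1.0, emg_env)                    # clamp to [0, 1]
    pitch_bend = max(-1.0, min(1.0, rel_position))   # clamp to [-1, 1]
    return {"amplitude": amplitude, "pitch_bend": pitch_bend}
```

Because the two mappings never read each other's input, varying one channel leaves the other's parameter untouched, which is the independence property the abstract contrasts with orthogonal systems.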
Ksii Transactions on Internet and Information Systems | 2015
Baptiste Caramiaux; Nicola Montecchio; Atau Tanaka; Frédéric Bevilacqua
This article presents a gesture recognition/adaptation system for human--computer interaction applications that goes beyond activity classification and that, as a complement to gesture labeling, characterizes the movement execution. We describe a template-based recognition method that simultaneously aligns the input gesture to the templates using a Sequential Monte Carlo inference technique. Contrary to standard template-based methods based on dynamic programming, such as Dynamic Time Warping, the algorithm has an adaptation process that tracks gesture variation in real time. The method continuously updates, during execution of the gesture, the estimated parameters and recognition results, which offers key advantages for continuous human--machine interaction. The technique is evaluated in several different ways: Recognition and early recognition are evaluated on 2D onscreen pen gestures; adaptation is assessed on synthetic data; and both early recognition and adaptation are evaluated in a user study involving 3D free-space gestures. The method is robust to noise, and successfully adapts to parameter variation. Moreover, it performs recognition as well as or better than nonadapting offline template-based methods.
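The core idea, tracking a gesture's alignment and variation against a template with Sequential Monte Carlo inference rather than offline dynamic programming, can be sketched with a toy particle filter. This is an illustrative simplification, not the published algorithm: the one-dimensional template, the state model (phase plus amplitude scale), and all noise parameters are assumptions.

```python
import math
import random

def run_particle_alignment(template, stream, n_particles=200, obs_sigma=0.2):
    """Track, sample by sample, the alignment phase and amplitude scale
    of an incoming 1-D gesture against a stored template."""
    # Each particle: [phase index into template, amplitude scale, weight]
    particles = [[0.0, random.uniform(0.5, 1.5), 1.0 / n_particles]
                 for _ in range(n_particles)]
    estimates = []
    for obs in stream:
        # Propagate: advance phase with noise, let the scale drift slightly,
        # which is what allows adaptation to variation during execution
        for p in particles:
            p[0] += 1.0 + random.gauss(0, 0.3)
            p[1] += random.gauss(0, 0.02)
        # Weight: Gaussian likelihood of the observation under each particle
        total = 0.0
        for p in particles:
            idx = min(max(int(round(p[0])), 0), len(template) - 1)
            pred = p[1] * template[idx]
            p[2] = math.exp(-((obs - pred) ** 2) / (2 * obs_sigma ** 2)) + 1e-12
            total += p[2]
        for p in particles:
            p[2] /= total
        # Running estimate: weighted mean of phase and scale, available
        # at every sample rather than only after the gesture ends
        phase = sum(p[0] * p[2] for p in particles)
        scale = sum(p[1] * p[2] for p in particles)
        estimates.append((phase, scale))
        # Resample to avoid weight degeneracy
        weights = [p[2] for p in particles]
        particles = [list(random.choices(particles, weights=weights)[0])
                     for _ in range(n_particles)]
        for p in particles:
            p[2] = 1.0 / n_particles
    return estimates
```

Because an estimate is emitted at every incoming sample, recognition and adaptation results are available mid-gesture, which is the property that distinguishes this family of methods from Dynamic Time Warping in the abstract above.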
ACM Transactions on Computer-Human Interaction | 2015
Baptiste Caramiaux; Marco Donnarumma; Atau Tanaka
Expressivity is a visceral capacity of the human body. To understand what makes a gesture expressive, we need to consider not only its spatial placement and orientation but also its dynamics and the mechanisms enacting them. We start by defining gesture and gesture expressivity, and then we present fundamental aspects of muscle activity and ways to capture information through electromyography and mechanomyography. We present pilot studies that inspect the ability of users to control spatial and temporal variations of 2D shapes and that use muscle sensing to assess expressive information in gesture execution beyond space and time. This leads us to the design of a study that explores the notion of gesture power in terms of control and sensing. Results give insights to interaction designers to go beyond simplistic gestural interaction, towards the design of interactions that draw on nuances of expressive gesture.
human factors in computing systems | 2013
Frédéric Bevilacqua; Sidney S. Fels; Alexander Refsum Jensenius; Michael J. Lyons; Norbert Schnell; Atau Tanaka
This SIG intends to investigate the ongoing dialogue between music technology and the field of human-computer interaction. Our specific aims are to consider major findings of musical interface research over recent years and discuss how these might best be conveyed to CHI researchers interested but not yet active in this area, as well as to consider how to stimulate future collaborations between music technology and CHI research communities.
human factors in computing systems | 2016
Marco Gillies; Rebecca Fiebrink; Atau Tanaka; Jérémie Garcia; Frédéric Bevilacqua; Alexis Heloir; Fabrizio Nunnari; Wendy E. Mackay; Saleema Amershi; Bongshin Lee; Nicolas D'Alessandro; Joëlle Tilmanne; Todd Kulesza; Baptiste Caramiaux
Machine learning is one of the most important and successful techniques in contemporary computer science. It involves the statistical inference of models (such as classifiers) from data. It is often conceived in a very impersonal way, with algorithms working autonomously on passively collected data. However, this viewpoint hides considerable human work of tuning the algorithms, gathering the data, and even deciding what should be modeled in the first place. Examining machine learning from a human-centered perspective includes explicitly recognizing this human work, as well as reframing machine learning workflows based on situated human working practices, and exploring the co-adaptation of humans and systems. A human-centered understanding of machine learning in human context can lead not only to more usable machine learning tools, but to new ways of framing learning computationally. This workshop will bring together researchers to discuss these issues and suggest future research questions aimed at creating a human-centered approach to machine learning.
human factors in computing systems | 2015
Baptiste Caramiaux; Alessandro Altavilla; Scott G. Pobiner; Atau Tanaka
Sonic interaction is the continuous relationship between user actions and sound, mediated by some technology. Because interaction with sound may be task-oriented or experience-based, it is important to understand the nature of action-sound relationships in order to design rich sonic interactions. We propose a participatory approach to sonic interaction design that first considers the affordances of sounds in order to imagine embodied interaction, and based on this, generates interaction models for interaction designers wishing to work with sound. We describe a series of workshops, called Form Follows Sound, where participants ideate imagined sonic interactions and then realize working interactive sound prototypes. We introduce the Sonic Incident technique as a way to recall memorable sound experiences. We identified three interaction models for sonic interaction design: conducting, manipulating, and substituting. These three interaction models offer interaction designers and developers a framework on which they can build richer sonic interactions.
human factors in computing systems | 2013
Baptiste Caramiaux; Frédéric Bevilacqua; Atau Tanaka
Gesture-based interaction is widespread in touch screen interfaces. The goal of this paper is to tap the richness of expressive variation in gesture to facilitate continuous interaction. We achieve this through novel techniques of adaptation and estimation of gesture characteristics. We describe two experiments. The first aims at understanding whether users can control certain gestural characteristics and if that control depends on gesture vocabulary. The second study uses a machine learning technique based on particle filtering to simultaneously recognize and measure variation in a gesture. With this technology, we create a gestural interface for a playful photo processing application. From these two studies, we show that 1) multiple characteristics can be varied independently in slower gestures (Study 1), and 2) users find gesture-only interaction less pragmatic but more stimulating than traditional menu-based systems (Study 2).
creativity and cognition | 2011
Lalya Gaye; Atau Tanaka
We describe a collaborative design project with a group of young people in which an interactive educational information pack for teenagers was implemented. Instead of just providing input to a design project, the young people initiated, controlled, and partially implemented the project themselves, with the support of an interdisciplinary research team. Here we present this approach to participatory design research, describe the design process, and show that initiative, control, and hands-on engagement in youth-led collaborative design can bring the young people a strong sense of ownership and empowerment.
human factors in computing systems | 2006
Kumiyo Nakakoji; Atau Tanaka; Daniel Fallman
The workshop seeks to bring together researchers and practitioners from diverse creative practices such as interaction design, industrial design, architectural design, media art, music, programming, writing, and scholarly work, to gain insight into the creative process. Each of these disciplines has established ways to nurture a creative impulse through to a concrete result. This is done in part by fostering a continuing internal dialog between creative instinct and external representations. Sketching is an activity common to these practices that is exercised during such creative refinement. By sketching, we mean not only hand-drawing on paper using a pencil, but also rapid, undetailed, brief, light, informal representations that practitioners produce and interact with. By investigating the sketching process in each practice, we expect to find commonalities that will point out essential elements for designing tools to support the creative process.