
Publication


Featured research published by Rebecca Fiebrink.


New Interfaces for Musical Expression | 2007

Don't forget the laptop: using native input capabilities for expressive musical control

Rebecca Fiebrink; Ge Wang; Perry R. Cook

We draw on our experiences with the Princeton Laptop Orchestra to discuss novel uses of the laptop's native physical inputs for flexible and expressive control. We argue that instruments designed using these built-in inputs offer benefits over custom standalone controllers, particularly in certain group performance settings; creatively thinking about native capabilities can lead to interesting and unique new interfaces. We discuss a variety of example instruments that use the laptop's native capabilities and suggest avenues for future work. We also describe a new toolkit for rapidly experimenting with these capabilities.
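The abstract describes mapping a laptop's built-in inputs to expressive control. As a minimal illustrative sketch (not the paper's toolkit; the function name and parameter ranges are hypothetical), one common pattern is mapping a normalized trackpad position to synthesis parameters:

```python
# Hypothetical sketch: map a normalized trackpad position (x, y in [0, 1])
# to synthesis parameters, in the spirit of using a laptop's native inputs.

def trackpad_to_params(x, y, f_lo=110.0, f_hi=880.0):
    """Map trackpad x to frequency (exponential, so pitch feels linear)
    and y to amplitude (clamped to [0, 1])."""
    freq = f_lo * (f_hi / f_lo) ** x
    amp = max(0.0, min(1.0, y))
    return freq, amp
```

The exponential frequency mapping is a standard choice so that equal trackpad motion produces equal pitch intervals.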


Human Factors in Computing Systems | 2009

Dynamic mapping of physical controls for tabletop groupware

Rebecca Fiebrink; Dan Morris; Meredith Ringel Morris

Multi-touch interactions are a promising means of control for interactive tabletops. However, a lack of precision and tactile feedback makes multi-touch controls a poor fit for tasks where precision and feedback are crucial. We present an approach that offers precise control and tactile feedback for tabletop systems through the integration of dynamically re-mappable physical controllers with the multi-touch environment, and we demonstrate this approach in our collaborative tabletop audio editing environment. An observational user study demonstrates that our approach can provide needed precision and feedback, while preserving the collaborative benefits of a shared direct-manipulation surface. Our observations also suggest that direct touch and physical controllers can offer complementary benefits, and that providing both allows users to adjust their control strategy based on considerations including precision, convenience, visibility, and user role.
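The core idea of dynamically re-mappable physical controllers can be sketched as a binding table that is rewritten at runtime, so one knob can serve many on-screen parameters. This is an illustrative sketch only, not the paper's implementation; all names are hypothetical:

```python
# Hypothetical sketch of dynamic re-mapping: a physical control's target
# parameter changes when the user binds it to a new on-screen widget.

class ControlSurface:
    def __init__(self):
        self.bindings = {}  # physical control id -> parameter name
        self.params = {}    # parameter name -> current value

    def bind(self, control_id, param):
        """Re-map a physical control to an on-screen parameter."""
        self.bindings[control_id] = param
        self.params.setdefault(param, 0.0)

    def turn(self, control_id, delta):
        """Apply a knob turn to whatever parameter it is currently bound to."""
        param = self.bindings.get(control_id)
        if param is not None:
            self.params[param] += delta

surface = ControlSurface()
surface.bind("knob1", "track1.volume")
surface.turn("knob1", 0.25)
surface.bind("knob1", "track2.pan")   # dynamic re-mapping
surface.turn("knob1", -0.1)
```

The same physical knob adjusts whichever parameter it was last bound to, while earlier parameter values persist.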


Human Factors in Computing Systems | 2016

Human-Centred Machine Learning

Marco Gillies; Rebecca Fiebrink; Atau Tanaka; Jérémie Garcia; Frédéric Bevilacqua; Alexis Heloir; Fabrizio Nunnari; Wendy E. Mackay; Saleema Amershi; Bongshin Lee; Nicolas D'Alessandro; Joëlle Tilmanne; Todd Kulesza; Baptiste Caramiaux

Machine learning is one of the most important and successful techniques in contemporary computer science. It involves the statistical inference of models (such as classifiers) from data. It is often conceived in a very impersonal way, with algorithms working autonomously on passively collected data. However, this viewpoint hides considerable human work of tuning the algorithms, gathering the data, and even deciding what should be modeled in the first place. Examining machine learning from a human-centered perspective includes explicitly recognising this human work, as well as reframing machine learning workflows based on situated human working practices, and exploring the co-adaptation of humans and systems. A human-centered understanding of machine learning in human context can lead not only to more usable machine learning tools, but to new ways of framing learning computationally. This workshop will bring together researchers to discuss these issues and suggest future research questions aimed at creating a human-centered approach to machine learning.


Human Factors in Computing Systems | 2015

Using Interactive Machine Learning to Support Interface Development Through Workshops with Disabled People

Simon Katan; Mick Grierson; Rebecca Fiebrink

We have applied interactive machine learning (IML) to the creation and customisation of gesturally controlled musical interfaces in six workshops with people with learning and physical disabilities. Our observations and discussions with participants demonstrate the utility of IML as a tool for participatory design of accessible interfaces. This work has also led to a better understanding of challenges in end-user training of learning models, of how people develop personalised interaction strategies with different types of pre-trained interfaces, and of how properties of control spaces and input devices influence people's customisation strategies and engagement with instruments. This work has also uncovered similarities between the musical goals and practices of disabled people and those of expert musicians.
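The interactive machine learning loop described here can be illustrated with a toy example: demonstrate gesture examples, train a simple model, try it, then add corrective examples and retrain. This sketch uses a 1-nearest-neighbour model for brevity; it is not the system used in the workshops, and all features and labels are hypothetical:

```python
# Hypothetical IML loop sketch: demonstrate, train, test, add examples.

def nearest_label(examples, point):
    """1-nearest-neighbour prediction over (feature_vector, label) pairs."""
    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(examples, key=lambda ex: dist(ex[0], point))[1]

# Round 1: the user demonstrates two gestures.
examples = [((0.1, 0.9), "open"), ((0.9, 0.1), "closed")]
assert nearest_label(examples, (0.2, 0.8)) == "open"

# The model misreads a mid-range gesture the user intended as "closed",
# so the user demonstrates it; with 1-NN, retraining is just appending.
examples.append(((0.5, 0.5), "closed"))
```

The point of the loop is that users steer the model by supplying examples rather than by editing code or parameters.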


Intelligent User Interfaces | 2015

A Model for Data-Driven Sonification Using Soundscapes

KatieAnna Wolf; Genna Gliner; Rebecca Fiebrink

A sonification is a rendering of audio in response to data, and is used in instances where visual representations of data are impossible, difficult, or unwanted. Designing sonifications often requires knowledge in multiple areas as well as an understanding of how the end users will use the system. This makes it an ideal candidate for end-user development where the user plays a role in the creation of the design. We present a model for sonification that utilizes user-specified examples and data to generate cross-domain mappings from data to sound. As a novel contribution we utilize soundscapes (acoustic scenes) for these user-selected examples to define a structure for the sonification. We demonstrate a proof of concept of our model using sound examples and discuss how we plan to build on this work in the future.
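A data-to-soundscape mapping of the kind described can be sketched as tying each data stream to one sound in a user-chosen soundscape and letting the data value scale a playback parameter. This is an illustrative sketch under those assumptions, not the paper's model; all names are hypothetical:

```python
# Hypothetical sketch: map a data value onto a playback rate for one
# sound drawn from a user-selected soundscape.

def sonify(value, lo, hi, sound, rate_lo=0.5, rate_hi=2.0):
    """Linearly map a data value in [lo, hi] to a playback rate,
    clamping out-of-range values."""
    t = (value - lo) / (hi - lo)
    t = max(0.0, min(1.0, t))
    return {"sound": sound, "rate": rate_lo + t * (rate_hi - rate_lo)}

event = sonify(25.0, lo=0.0, hi=50.0, sound="birdsong")
```

A full system would layer many such mappings so the rendered soundscape tracks the whole dataset.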


Human Factors in Computing Systems | 2010

Real-time interaction with supervised learning

Rebecca Fiebrink

My work concerns the design of interfaces for effective interaction with machine learning algorithms in real-time application domains. I am interested in supporting human interaction throughout the entire supervised learning process, including the generation of training examples. In my dissertation research, I seek to better understand how new machine learning interfaces might improve accessibility and usefulness to non-technical users, to further explore how differences between machine learning in practice and machine learning in theory can inform both interface and algorithm design, and to employ new machine learning interfaces for novel applications in real-time music composition and performance.


Human Factors in Computing Systems | 2017

Mixed-Initiative Creative Interfaces

Sebastian Deterding; Jonathan Hook; Rebecca Fiebrink; Marco Gillies; Jeremy Gow; Memo Akten; Gillian Smith; Antonios Liapis; Kate Compton

Enabled by artificial intelligence techniques, we are witnessing the rise of a new paradigm of computational creativity support: mixed-initiative creative interfaces put human and computer in a tight interactive loop where each suggests, produces, evaluates, modifies, and selects creative outputs in response to the other. This paradigm could broaden and amplify creative capacity for all, but has so far remained mostly confined to artificial intelligence for game content generation, and faces many unsolved interaction design challenges. This workshop therefore convenes CHI and game researchers to advance mixed-initiative approaches to creativity support.


Archive | 2017

Machine Learning as Meta-Instrument: Human-Machine Partnerships Shaping Expressive Instrumental Creation

Rebecca Fiebrink

In this chapter, I describe how supervised learning algorithms can be used to build new digital musical instruments. Rather than merely serving as methods for inferring mathematical relationships from data, I show how these algorithms can be understood as valuable design tools that support embodied, real-time, creative practices. Through this discussion, I argue that the relationship between instrument builders and instrument creation tools warrants closer consideration: the affordances of a creation tool shape the musical potential of the instruments that are built, as well as the experiences and even the creative aims of the human builder. Understanding creation tools as “instruments” themselves invites us to examine them from perspectives informed by past work on performer-instrument interactions.


Human Factors in Computing Systems | 2014

Improving data-driven design and exploration of digital musical instruments

Christopher Patrick Laguna; Rebecca Fiebrink

We present Gesture Mapper, an application for digital musical instrument designers to rapidly prototype mappings from performer gestures to sound synthesis parameters. Prior work [2] has shown that using interactive supervised learning to generate mappings from user-generated examples can be more efficient and effective than users writing mapping functions in code. In this work, we explore new ways to improve on data-driven design of interactive systems, specifically by proposing new mechanisms for rapid exploration and comparison of multiple alternative mappings. We present a conceptual structure for interactive mappings, a basic framework for generating mappings from more diverse types of user-specified constraints than are supported by supervised learning, and the new Gesture Mapper user interface for mapping exploration and comparison.
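The data-driven mapping idea behind Gesture Mapper can be illustrated with the simplest possible case: fit a mapping from one gesture feature to one synthesis parameter from user-supplied example pairs, instead of writing the mapping function by hand. This linear-fit sketch is illustrative only, not the tool's actual method:

```python
# Hypothetical sketch: learn a linear map y = a*x + b from
# (gesture_feature, synth_parameter) example pairs by least squares.

def fit_linear(pairs):
    n = len(pairs)
    sx = sum(x for x, _ in pairs)
    sy = sum(y for _, y in pairs)
    sxx = sum(x * x for x, _ in pairs)
    sxy = sum(x * y for x, y in pairs)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return lambda x: a * x + b

# User demonstrates: arm low -> quiet, arm high -> loud.
mapping = fit_linear([(0.0, 0.0), (1.0, 1.0), (0.5, 0.5)])
```

Exploring and comparing alternative mappings, as the paper proposes, would amount to fitting several such functions from different example sets and auditioning them side by side.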


Advanced Visual Interfaces | 2014

BeatBox: end-user interactive definition and training of recognizers for percussive vocalizations

Kyle Hipke; Michael Toomim; Rebecca Fiebrink; James Fogarty

Interactive end-user training of machine learning systems has received significant attention as a tool for personalizing recognizers. However, most research limits end users to training a fixed set of application-defined concepts. This paper considers additional challenges that arise in end-user support for defining the number and nature of concepts that a system must learn to recognize. We develop BeatBox, a new system that enables end-user creation of custom beatbox recognizers and interactive adaptation of recognizers to an end user's technique, environment, and musical goals. BeatBox proposes rapid end-user exploration of variations in the number and nature of learned concepts, and provides end users with feedback on the reliability of recognizers learned for different potential combinations of percussive vocalizations. In a preliminary evaluation, we observed that end users were able to quickly create usable classifiers, that they explored different combinations of concepts to test alternative vocalizations and to refine classifiers for new musical contexts, and that learnability feedback was often helpful in alerting them to potential difficulties with a desired learning concept.
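The learnability feedback described here can be sketched as a per-class reliability estimate: hold each example out, predict it from the rest, and report accuracy per vocalization class so users can spot confusable concepts. This sketch uses leave-one-out 1-nearest-neighbour as a stand-in, not BeatBox's actual recognizer; features and class names are illustrative:

```python
# Hypothetical sketch of learnability feedback: per-class leave-one-out
# accuracy over labelled feature vectors for a 1-NN recognizer.

def loo_accuracy(examples):
    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    hits, totals = {}, {}
    for i, (x, label) in enumerate(examples):
        rest = examples[:i] + examples[i + 1:]
        pred = min(rest, key=lambda ex: dist(ex[0], x))[1]
        totals[label] = totals.get(label, 0) + 1
        hits[label] = hits.get(label, 0) + (pred == label)
    return {c: hits[c] / totals[c] for c in totals}

examples = [((0.0, 0.1), "kick"), ((0.1, 0.0), "kick"),
            ((0.9, 1.0), "hat"), ((1.0, 0.9), "hat")]
scores = loo_accuracy(examples)
```

A class with a low score signals that its examples overlap another class, prompting the user to record clearer demonstrations or merge concepts.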

Collaboration


Dive into Rebecca Fiebrink's collaborations.

Top Co-Authors
Reid Oda

Princeton University
