Luke Dahl
University of Virginia
Publications
Featured research published by Luke Dahl.
Computer Music Journal | 2015
Luke Dahl
Motion-sensing technologies enable musical interfaces where performers control sound by moving their bodies “in the air,” without touching a physical object. These interfaces work well when the movement and resulting sound are smooth and continuous, but it has proven difficult to design air instruments that trigger discrete sounds with a precision that feels natural to performers and allows them to play rhythmically complex music. This article presents a study of “air drumming” gestures: participants performed drumming-like gestures in time to simple recorded rhythms, and their movements were recorded and examined for features that correspond to the timing of the sounds. The goal is to understand what we do with our bodies when we gesture in the air to trigger a sound. Two movement features of the hand are studied: hits, the moments when the hand changes direction at the end of the striking gesture, and acceleration peaks, sharp peaks in acceleration magnitude as the hand decelerates. Hits and acceleration peaks are also detected for the movement of the wrist. It is found that acceleration peaks are more useful than hits because they occur earlier and with less variability, and their timing changes less with note speed. It is also shown that timing differences between hand and wrist features can be used to group performers into distinct movement styles.
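As a rough illustration of the two features (a sketch, not the authors' analysis code), the Python fragment below detects direction-change hits and acceleration-magnitude peaks in a synthetic vertical hand trajectory; the trajectory, sample rate, and prominence threshold are all assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

# Synthetic vertical hand position for four "air-drum" strikes, sampled
# at 120 Hz (the trajectory and sample rate are illustrative assumptions).
fs = 120.0
t = np.arange(0, 2.0, 1.0 / fs)
y = -np.abs(np.sin(2 * np.pi * t)) ** 3  # sharp dips stand in for strikes

# Velocity and acceleration via finite differences.
vy = np.gradient(y, 1.0 / fs)
ay = np.gradient(vy, 1.0 / fs)

# Hits: the hand reverses direction at the bottom of the stroke, i.e.
# vertical velocity crosses zero from negative to positive.
hits = np.where((vy[:-1] < 0) & (vy[1:] >= 0))[0]

# Acceleration peaks: sharp maxima in acceleration magnitude as the hand
# decelerates (the prominence threshold here is arbitrary).
peaks, _ = find_peaks(np.abs(ay), prominence=0.5 * np.abs(ay).max())

print("hit times (s):      ", t[hits])
print("acc-peak times (s): ", t[peaks])
```

On real motion-capture data the same peak-picking would run on the measured hand and wrist markers, which is what allows the two feature types to be compared for timing.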
New Interfaces for Musical Expression | 2007
Luke Dahl; Nathan Whetsell; John Van Stoecker
In this paper, we describe a musical controller, the WaveSaw, for directly manipulating a wavetable. The WaveSaw consists of a long, flexible metal strip with handles on either end, somewhat analogous to a saw. The user plays the WaveSaw by holding the handles and bending the metal strip. We use sensors to measure the strip's curvature and reconstruct its shape as a wavetable stored in a computer. This provides a direct gestural mapping from the shape of the WaveSaw to the timbral characteristics of the computer-generated sound. Additional sensors provide control of pitch, amplitude, and other musical parameters.
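A minimal sketch of this shape-to-wavetable idea, under a small-deflection assumption and with made-up sensor values (the sensor count, normalization, and table-lookup oscillator are illustrative, not the paper's implementation):

```python
import numpy as np

# Hypothetical readings from curvature sensors spaced evenly along the
# strip (sensor count and values are illustrative assumptions).
curvature = np.array([0.0, 0.8, 1.2, 0.5, -0.6, -1.1, -0.4, 0.0])

# Under a small-deflection assumption, integrating curvature twice along
# the strip's length approximates its deflected shape.
slope = np.cumsum(curvature)
shape = np.cumsum(slope)
shape = shape - shape.mean()          # remove DC offset
shape = shape / np.abs(shape).max()   # normalize to [-1, 1]

# Resample the reconstructed shape to a fixed-size wavetable.
table_size = 512
wavetable = np.interp(
    np.linspace(0, len(shape) - 1, table_size),
    np.arange(len(shape)),
    shape,
)

# Table-lookup oscillator: read the wavetable at a rate set by pitch.
def render(freq_hz, dur_s, sr=44100):
    phase = (np.arange(int(dur_s * sr)) * freq_hz / sr) % 1.0
    return np.interp(phase * (table_size - 1), np.arange(table_size), wavetable)

samples = render(220.0, 0.5)  # half a second at 220 Hz
```

Bending the strip into a new shape immediately yields a new wavetable, which is the direct gestural mapping from shape to timbre described above.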
Computer Music Modeling and Retrieval | 2015
Luke Dahl
New air-instruments allow us to control sound by moving our bodies in space without manipulating a physical object. However, when we want to trigger a discrete sound at a precise time, for example by making a drumming gesture, the timing feels wrong. This work aims to understand which aspects of a performer's movement correspond to their subjective sense of when the sound should occur. A study of air-drumming gestures was conducted, and the timing of eight movement events, based on movements of the hand, wrist, elbow joint, and wrist joint, is examined. In general, it is found that movement events based on peaks in acceleration are better because they occur earlier and with less noise than events based on changes of direction.
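To make the comparison concrete, here is a hypothetical sketch of how anticipation and variability could be quantified for two event types; all times below are invented, chosen only to mimic the qualitative finding, and are not the study's data.

```python
import numpy as np

# Hypothetical detected event times (s) for one performer, relative to
# metronome beats at 0.5 s intervals (all values are illustrative).
beats = np.arange(0.5, 4.5, 0.5)
acc_peak_times = beats - 0.060 + np.random.default_rng(0).normal(0, 0.008, beats.size)
hit_times      = beats - 0.020 + np.random.default_rng(1).normal(0, 0.020, beats.size)

def timing_stats(events, beats):
    """Offset of each event from its nearest beat: the mean gives the
    anticipation, the standard deviation the timing variability."""
    nearest = beats[np.abs(events[:, None] - beats[None, :]).argmin(axis=1)]
    offsets = events - nearest
    return offsets.mean(), offsets.std()

for name, ev in [("acceleration peaks", acc_peak_times), ("hits", hit_times)]:
    mean, sd = timing_stats(ev, beats)
    print(f"{name}: mean offset {mean * 1000:+.1f} ms, sd {sd * 1000:.1f} ms")
```

An event type that fires earlier (more negative mean offset) and with a smaller standard deviation gives a sound-triggering system more, and more reliable, lead time.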
Proceedings of the 5th International Conference on Movement and Computing | 2018
Luke Dahl; Federico Visi
Marker-based motion capture systems that stream precise movement data in real time afford interaction scenarios that can be subtle, detailed, and immediate. However, challenges to using this data effectively include having to build bespoke processing systems that may not scale well, and the need for higher-level representations of movement and movement qualities. We present modosc, a set of Max abstractions for computing motion descriptors from raw motion capture data in real time. Modosc is designed to address the data-handling and synchronization issues that arise when working with complex marker sets, and to structure data streams in a meaningful and easily accessible manner. This is achieved by adopting a multiparadigm programming approach using o.dot and Open Sound Control. We describe an initial set of motion descriptors, the addressing system employed, and design decisions and challenges.
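The descriptor below, a simple "quantity of motion," is one example of the kind of higher-level representation such a toolkit provides; this standalone Python sketch is only an illustration and does not reproduce modosc's Max/o.dot implementation or its descriptor set.

```python
import numpy as np

# Fake (frame, marker, xyz) capture data at an assumed 100 Hz frame rate;
# the marker count and trajectories are illustrative.
fs = 100.0
rng = np.random.default_rng(42)
frames = rng.normal(size=(200, 5, 3)).cumsum(axis=0)

def quantity_of_motion(frames, fs):
    """Sum of per-marker speeds in each frame: a scalar activity measure."""
    disp = np.diff(frames, axis=0)             # displacement per frame
    speed = np.linalg.norm(disp, axis=2) * fs  # (frame, marker) speeds
    return speed.sum(axis=1)                   # total over markers

qom = quantity_of_motion(frames, fs)
print("mean quantity of motion:", qom.mean())
```

In a real-time setting the same computation would run per incoming frame, with the result addressed and streamed onward over Open Sound Control.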
Computer Music Modeling and Retrieval | 2015
Florent Berthaut; Luke Dahl
Orchestras of Digital Musical Instruments (DMIs) enable new possibilities for musical collaboration, extending those of acoustic and electric orchestras. However, the creation and development of these orchestras remain constrained: each new musical collaboration system or orchestra piece relies on a fixed number of musicians, a fixed set of instruments (often only one), and a fixed subset of possible modes of collaboration. In this paper, we describe a unified framework that enables the design of digital orchestras with potentially different DMIs and an expandable set of collaboration modes. It draws on research on the analysis and classification of traditional and digital orchestras, on research in Collaborative Virtual Environments, and on interviews with musicians and composers. The BOEUF framework consists of a classification of modes of collaboration and a set of components for modelling digital orchestras. Integrating this framework into DMIs will enable advanced musical collaboration modes to be used in any digital orchestra, including spontaneous jam sessions.
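As an illustration of what components for modelling digital orchestras might look like in code, here is a hypothetical sketch; the mode names are invented for the example and are not the paper's actual classification.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

# Hypothetical collaboration modes, for illustration only.
class CollaborationMode(Enum):
    SHARED_CONTROL = auto()  # several musicians drive one parameter
    EXCHANGE = auto()        # musicians pass musical material around
    OBSERVATION = auto()     # musicians monitor each other's state

@dataclass
class Instrument:
    name: str
    modes: set = field(default_factory=set)

@dataclass
class Orchestra:
    instruments: list = field(default_factory=list)

    def common_modes(self):
        """Collaboration modes every instrument in the orchestra supports."""
        if not self.instruments:
            return set()
        modes = set(CollaborationMode)
        for inst in self.instruments:
            modes &= inst.modes
        return modes

band = Orchestra([
    Instrument("WaveSaw", {CollaborationMode.SHARED_CONTROL,
                           CollaborationMode.OBSERVATION}),
    Instrument("AirDrum", {CollaborationMode.OBSERVATION}),
])
print(band.common_modes())  # only OBSERVATION is shared by both
```

Declaring the modes each instrument supports, as sketched here, is what would let heterogeneous DMIs negotiate a shared set of collaboration modes in an ad hoc orchestra.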
User Interface Software and Technology | 2012
Luke Dahl; Sébastien Robaszkiewicz
We investigate the effects of adding structure to musical interactions for novices. A simple instrument allows control of three musical parameters: pitch, timbre, and note density. Two users can play at once, and their actions are visible on a public display. We asked pairs of users to perform duets under two interaction conditions: unstructured, where users are free to play what they like, and structured, where users are directed to different areas of the musical parameter space by time-varying constraints indicated on the display. A control group played two duets without structure, while an experimental group played one duet with structure and a second without. By crowd-sourcing rankings of the recorded duets, we find that structure leads to musically better results. A post-experiment survey showed that the experimental group had a better experience during the second, unstructured duet than during the structured one.
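One way such time-varying constraints could be implemented is sketched below; the parameter ranges and schedule are invented for illustration and are not the study's actual design.

```python
from dataclasses import dataclass

# A time-varying constraint on the musical parameter space (parameter
# names follow the abstract; ranges and timing are assumptions).
@dataclass
class Constraint:
    start_s: float
    end_s: float
    pitch: tuple    # allowed MIDI-note range (lo, hi)
    density: tuple  # allowed notes-per-second range (lo, hi)

SCHEDULE = [
    Constraint(0.0, 15.0, pitch=(48, 60), density=(0.5, 2.0)),   # low, sparse
    Constraint(15.0, 30.0, pitch=(60, 84), density=(2.0, 8.0)),  # high, busy
]

def clamp(value, lo_hi):
    lo, hi = lo_hi
    return max(lo, min(hi, value))

def constrain(t_s, pitch, density):
    """Pull a player's raw input into the region active at time t_s."""
    for c in SCHEDULE:
        if c.start_s <= t_s < c.end_s:
            return clamp(pitch, c.pitch), clamp(density, c.density)
    return pitch, density  # unstructured: no active constraint

print(constrain(5.0, pitch=72, density=4.0))  # -> (60, 2.0)
```

Rendering the active region on the public display, while clamping or nudging input toward it, is one plausible way to steer two players into complementary parts of the space.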
New Interfaces for Musical Expression | 2010
Jieun Oh; Jorge Herrera; Nicholas J. Bryan; Luke Dahl; Ge Wang
New Interfaces for Musical Expression | 2010
Luke Dahl; Ge Wang
Archive | 2000
Luke Dahl; Jean-Marc Jot
New Interfaces for Musical Expression | 2011
Luke Dahl; Jorge Herrera; Carr Wilkerson