Publication


Featured research published by Daniel Arfib.


IEEE Transactions on Audio, Speech, and Language Processing | 2006

Adaptive digital audio effects (a-DAFx): a new class of sound transformations

Vincent Verfaille; Udo Zölzer; Daniel Arfib

After covering the basics of sound perception and giving an overview of commonly used audio effects (using a perceptual categorization), we propose a new concept called adaptive digital audio effects (A-DAFx). This consists of combining a sound transformation with an adaptive control. To create A-DAFx, low-level and perceptual features are extracted from the input signal in order to derive the control values according to specific mapping functions. We detail the implementation of various new adaptive effects and give examples of their musical use.
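
The adaptive-control idea summarized above (a feature extracted from the input signal, passed through a mapping function, driving an effect parameter) can be sketched in a few lines. The sketch below is not the paper's implementation: it assumes a frame-wise RMS feature and a tremolo whose depth follows the input level, both chosen purely for illustration.

```python
# Minimal sketch of an adaptive effect: feature extraction -> mapping -> effect.
# RMS feature and tremolo effect are illustrative assumptions, not the paper's.
import numpy as np

def rms_envelope(x, frame=512, hop=256):
    """Frame-wise RMS of the input, interpolated back to one value per sample."""
    n_frames = 1 + max(0, (len(x) - frame) // hop)
    rms = np.array([np.sqrt(np.mean(x[i * hop:i * hop + frame] ** 2))
                    for i in range(n_frames)])
    centers = np.arange(n_frames) * hop + frame // 2
    return np.interp(np.arange(len(x)), centers, rms)

def adaptive_tremolo(x, sr, rate_hz=5.0):
    """Tremolo whose depth is derived from the input level (louder -> deeper)."""
    env = rms_envelope(x)
    depth = np.clip(env / (env.max() + 1e-12), 0.0, 1.0)   # the 'mapping function'
    lfo = 0.5 * (1.0 + np.sin(2 * np.pi * rate_hz * np.arange(len(x)) / sr))
    return x * (1.0 - depth * lfo)

if __name__ == "__main__":
    sr = 44100
    t = np.arange(sr) / sr
    crescendo = np.sin(2 * np.pi * 220 * t) * np.linspace(0.1, 1.0, sr)
    out = adaptive_tremolo(crescendo, sr)
```

Swapping the RMS feature for a perceptual one, or the tremolo for another transformation, only changes the two functions above, which is the modularity the adaptive-control scheme relies on.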


Journal of New Music Research | 2005

Expressiveness and Digital Musical Instrument Design

Daniel Arfib; Jean-Michel Couturier; Loïc Kessous

In this article, after giving some possible definitions of “expressiveness”, we examine the problem of expressiveness in digital musical instruments, which tends to involve using specific gestures to obtain an expressive sound rather than performing expressive gestures. Some of the particular features of digital musical instruments, such as pitch control, dynamic control and the possibility of exploring sound palettes, are described and some practical examples given. Finally, several musical implications of the gestures used to obtain musical expressiveness are discussed, from pedagogical and other related points of view.


International Gesture Workshop on Gesture and Sign Languages in Human-Computer Interaction (GW '01), Revised Papers | 2001

Gestural Control of Sound Synthesis and Processing Algorithms

Daniel Arfib; Loïc Kessous

Computer programs such as MUSIC V or CSOUND have produced a huge number of sound examples, in both the synthesis and the processing domains. Translating such algorithms into real-time programs such as MAX-MSP allows these digitally created sounds to be used effectively in performance, which includes interpretation, expressivity, and even improvisation and creativity. This particular bias of our project (from sound to gesture) raises new questions, such as the choice of strategies for gesture control and feedback, as well as the mapping of peripheral data to synthesis and processing data. A learning process is required for these new controls, and the issue of virtuosity versus simplicity is an everyday challenge.
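
To make the mapping question concrete, here is a minimal sketch of peripheral data being mapped to synthesis parameters. The controller fields (x, y, pressure), the mapping curves and the toy additive synth are all assumptions made for illustration, not the authors' instruments.

```python
# Minimal sketch of gesture-to-synthesis mapping; controller fields, ranges and
# the toy additive synth are illustrative assumptions only.
import numpy as np

def map_gesture(x, y, pressure):
    """Map normalized controller data (0..1) to synthesis parameters."""
    freq = 110.0 * 2.0 ** (x * 4.0)      # x -> pitch over four octaves
    amp = pressure ** 2                   # pressure -> loudness, perceptual-ish curve
    brightness = 0.2 + 0.8 * y            # y -> harmonic richness
    return {"freq": freq, "amp": amp, "brightness": brightness}

def render(params, sr=44100, dur=0.1):
    """Tiny additive 'synth': a few harmonics weighted by the brightness value."""
    t = np.arange(int(sr * dur)) / sr
    sig = sum(params["brightness"] ** (k - 1) *
              np.sin(2 * np.pi * k * params["freq"] * t)
              for k in range(1, 5))
    return params["amp"] * sig / 4.0

# Example: one control block rendered from a single gesture reading.
block = render(map_gesture(x=0.5, y=0.7, pressure=0.8))
```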


International Gesture Workshop | 2003

Design and Use of Some New Digital Musical Instruments

Daniel Arfib; Jean-Michel Couturier; Loïc Kessous

This article presents some facts about the use of gesture in computer music, more specifically in home-made instruments dedicated to performance on stage. We first give some theoretical and practical ideas for designing an instrument in three areas: the sound, the gesture and the mapping. Then we introduce three examples of digital instruments we have created, focusing on their design and their musical use.


Lecture Notes in Computer Science | 2005

Some experiments in the gestural control of synthesized sonic textures

Daniel Arfib; Jean-Michel Couturier; Jehan-Julien Filatriau

In this paper, we introduce some exploratory ideas and applications involving the gestural control of sonic textures. Three examples of how the gestural control of synthesized textures can be implemented are presented: scratching textures, based on the gesturalized exploration of a visual space; dynamic noise filtering, where gestures influence a virtual slowly moving string used to filter a noise; and breathing textures, where the metaphor of breathing is used in the sound as well as in the gestural control. Lastly, we discuss how to find connections between appropriate gestures and sonic texture processes, with a view to producing coherent and expressive digital musical instruments.
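
The "dynamic noise filtering" example lends itself to a short sketch: a gesture trajectory is smoothed, as if it were dragging a slowly moving resonance, and then steers a resonator applied to white noise. The synthetic gesture, the smoothing constant and the two-pole resonator below are illustrative assumptions, not the implementation described in the paper.

```python
# Minimal sketch of gesture-driven noise filtering; gesture source, smoothing
# and filter choice are illustrative assumptions only.
import numpy as np

def smooth(control, coeff=0.9995):
    """One-pole lowpass on the control signal: the 'slowly moving' behaviour."""
    out = np.empty_like(control)
    state = control[0]
    for i, c in enumerate(control):
        state = coeff * state + (1.0 - coeff) * c
        out[i] = state
    return out

def resonant_noise(center_hz, sr=44100, r=0.995, seed=0):
    """White noise through a time-varying two-pole resonator."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal(len(center_hz))
    y = np.zeros(len(center_hz))
    for n in range(2, len(center_hz)):
        w = 2 * np.pi * center_hz[n] / sr
        y[n] = 2 * r * np.cos(w) * y[n - 1] - r * r * y[n - 2] + (1 - r) * noise[n]
    return y

sr = 44100
raw_gesture = np.repeat(np.array([300.0, 1200.0, 600.0]), sr)  # stand-in for sensor data
texture = resonant_noise(smooth(raw_gesture), sr=sr)
```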


Journal of New Music Research | 2002

Musical Implications of Digital Audio Effects

Daniel Arfib

A question lay at the origin of a European action named "digital audio effects": what is a "digital audio effect"? As this question was at the centre of a "bottom-up" action in the COST framework, the best thing was to see what emerged. From this point of view, the four previous DAFx conferences are a source of incredible creativity, and I am curious about the next one, DAFx02, which will take place in Hamburg in September 2002. DAFx has led to so many things and its scope is broad, from music to environmental sounds, physical modelling and hardware implementations. The link between research on signal processing and the art of making musical sounds is really challenging. When a COST G6 management committee decided to look for a theme for a special issue related to music and DAFx, I was certain that we would have many references on "musical applications." In fact, few authors dared touch this topic directly. So the term "Implications" was adopted, in such a way that a technology can infuse and fertilise a field of creativity, just as music can make technology artistic. This is what this special issue is all about: not just a collection of wishes, but a snapshot of how people in a network can feel, see and act. In April 2001 we decided to look for articles first presented at the DAFx conferences and to ask authors to produce an expanded and improved version of their work. This issue is the result of this endeavour.


Organised Sound | 2002

Strategies of mapping between gesture data and synthesis model parameters using perceptual spaces

Daniel Arfib; Jean-Michel Couturier; Loïc Kessous; Vincent Verfaille


Archive | 2000

Traditional (?) Implementations of a Phase-Vocoder: The Tricks of the Trade

Amalia De Götzen; Nicola Bernardini; Daniel Arfib


DAFX: Digital Audio Effects, Second Edition | 2002

Time-frequency processing

Daniel Arfib; F. Keiler; U. Zoelzer


New Interfaces for Musical Expression (NIME) | 2003

Pointing fingers: using multiple direct interactions with visual objects to perform music

Jean-Michel Couturier; Daniel Arfib

Collaboration


Dive into Daniel Arfib's collaborations.

Top Co-Authors

Jean-Michel Couturier
Centre national de la recherche scientifique

Jehan-Julien Filatriau
Centre national de la recherche scientifique

Loïc Kessous
Centre national de la recherche scientifique

Nathalie Delprat
Centre national de la recherche scientifique

U. Zoelzer
Helmut Schmidt University

Udo Zölzer
Helmut Schmidt University