Publication


Featured research published by Micheline Lesaffre.


Journal of New Music Research | 2005

Prediction of Musical Affect Using a Combination of Acoustic Structural Cues

Marc Leman; Valery Vermeulen; Liesbeth De Voogdt; Dirk Moelants; Micheline Lesaffre

This study explores whether musical affect attribution can be predicted by a linear combination of acoustical structural cues. To that aim, a database of sixty musical audio excerpts was compiled and analyzed at three levels: judgments of affective content by subjects; judgments of structural content by musicological experts (i.e., “manual structural cues”); and extraction of structural content by an auditory-based computer algorithm (“acoustical structural cues”). In Study I, an affect space was constructed with Valence (gay-sad), Activity (tender-bold) and Interest (exciting-boring) as the main dimensions, using the responses of a hundred subjects. In Study II, manual and acoustical structural cues were analyzed and compared. Manual structural cues such as loudness and articulation could be accounted for in terms of a combination of acoustical structural cues. In Study III, the subjective responses of eight individual subjects were analyzed using the affect space obtained in Study I, and modeled in terms of the structural cues obtained in Study II, using linear regression modeling. This worked better for the Activity dimension than for the Valence dimension, while the Interest dimension could not be accounted for. Overall, manual structural cues worked better than acoustical structural cues. In a final assessment study, a selected set of acoustical structural cues was used for building prediction models. The results indicate that musical affect attribution can partly be predicted using a combination of acoustical structural cues. Future research may focus on non-linear approaches, expansion of the dataset and subject pool, and refinement of acoustical structural cue extraction.
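A minimal sketch of the kind of linear regression modelling described in this abstract: predicting one affect dimension (here Activity) from a handful of acoustic structural cues. The cue matrix, weights and ratings below are simulated placeholders, not the study's data.

```python
# Sketch (assumed, illustrative data): predict an affect rating from acoustic cues
# with a linear model, as in the regression modelling described in the abstract.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical cue matrix: one row per excerpt, one column per acoustic cue
# (e.g. loudness, tempo, articulation, roughness).
X = rng.normal(size=(60, 4))          # 60 excerpts, 4 acoustic structural cues
activity = X @ np.array([0.6, 0.3, -0.2, 0.1]) + rng.normal(scale=0.5, size=60)

model = LinearRegression().fit(X, activity)
r2 = cross_val_score(model, X, activity, cv=5, scoring="r2")
print("cue weights:", model.coef_)
print("cross-validated R^2:", r2.mean())
```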


Signal Processing | 2010

Access to ethnic music: Advances and perspectives in content-based music information retrieval

Olmo Cornelis; Micheline Lesaffre; Dirk Moelants; Marc Leman

Access to digital music collections is nowadays facilitated by content-based methods that allow the retrieval of music on the basis of intrinsic properties of audio, in addition to advanced metadata processing. However, access to ethnic music remains problematic, as this music does not always correspond to the Western concepts that underlie the currently available content-based methods. In this paper, we examine the literature on access to ethnic music, while focusing on the reasons why the existing techniques fail or fall short of expectations and what can be done about it. The paper considers a review of the work on signals and feature extraction, on symbolic and semantic information processing, and on metadata and context tools. An overview is given of several European ethnic music archives and related ongoing research projects. Problems are highlighted and suggestions of the ways in which to improve access to ethnic music collections are given.


Journal of New Music Research | 2012

Assessing a clarinet player's performer gestures in relation to locally intended musical targets

Frank Desmet; Luc Nijs; Michiel Demey; Micheline Lesaffre; Jean-Pierre Martens; Marc Leman

Musicianship is known to display high-level skills, which involve different aspects of mental processing and corporeal control. Of particular interest is the match between the musician's mental focus on musical targets (the so-called musical intentions) and the expressive (or so-called auxiliary) body movements. To what extent are these related to each other? And what does this relationship reveal about mind–body connections? To approach these questions, a case study was set up around a clarinet solo performance played from score, covering a style of music unfamiliar to the player. The clarinetist's movements were recorded with an optical movement tracking system. A statistical analysis method was developed to account for movement data in relation to the potential musical intentions and targets. The bottom-up movement analysis method was validated against the performer's annotations of targets in the musical score and the performer's annotations of communicative/sound-facilitating gestures in the performance video. The results reveal that the mental focus on musical targets is related to bodily expression. This finding supports the idea of an embodied model of musical syntax processing, which is strongly related to corporeal gestures.


Interacting with Computers | 2012

Interacting with the Music Paint Machine: Relating the constructs of flow experience and presence

Luc Nijs; Pieter Coussement; Bart Moens; Denis Amelinck; Micheline Lesaffre; Marc Leman

In this paper we report on the results of an experiment on the experience of flow and presence while engaging with an interactive music system, the Music Paint Machine. This music system provides a game-like environment in which a musician can create a digital painting by playing an acoustic musical instrument, by moving the body in different directions, and by selecting colours using a pressure mat. The experiment aimed at getting a better insight into the possible relationship between flow experience and presence. Based on the definition of flow as a combination of the highest level of presence (presence-as-feeling) and a positive emotional state (Riva et al., 2004a), we hypothesized that presence has a predictive value for flow. Sixty-five musicians, both amateur and professional, participated in the experiment. Flow experience was measured with the Flow State Scale (Jackson and Eklund, 2004). Presence was measured with an in-house designed presence questionnaire. Results showed a significant, strong correlation between flow and presence. Moreover, the scores for presence significantly predicted the Flow State Scale scores and explained a significant proportion of their variance. Furthermore, many significant associations were found between flow and presence variables, the most notable being the strong correlations (Spearman's rank) between the naturalness of using the system and the Flow State Scale, and between the feeling of non-mediation and the Flow State Scale.
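The two analyses mentioned above, a Spearman rank correlation between presence and flow and a regression of flow on presence, can be illustrated with a short sketch. The scores below are simulated placeholders, not the experiment's questionnaire data.

```python
# Sketch (assumed, simulated scores): Spearman rank correlation and a simple
# regression of Flow State Scale scores on presence scores.
import numpy as np
from scipy.stats import spearmanr, linregress

rng = np.random.default_rng(1)
presence = rng.uniform(1, 7, size=65)                    # 65 participants
flow = 0.8 * presence + rng.normal(scale=0.6, size=65)   # Flow State Scale scores

rho, p_rho = spearmanr(presence, flow)
fit = linregress(presence, flow)

print(f"Spearman rho = {rho:.2f} (p = {p_rho:.3g})")
print(f"flow ~ {fit.intercept:.2f} + {fit.slope:.2f} * presence, R^2 = {fit.rvalue**2:.2f}")
```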


Computational Science and Engineering | 2009

Concepts, Technology, and Assessment of the Social Music Game "Sync-in-Team"

Marc Leman; Michiel Demey; Micheline Lesaffre; Leon van Noorden; Dirk Moelants

Music offers an excellent domain in which advanced forms of non-verbal communication can be explored. The first part of this paper introduces the research concepts behind the idea of a social interactive music game, which is based on the notions of ‘embodiment’ and ‘mediation technology’. The second part reviews the development of the ‘Sync-in-Team’ game, and its assessment in four different settings, including noisy ecological settings. The third part reviews the technological backbone of the game, and the fourth part discusses further developments. A user-oriented approach, based on concepts from embodied music cognition, may offer a valid contribution to the development of novel music-driven games that foster a sense of social interaction, body movement, collaboration, and competition.


Journal of New Music Research | 1999

Automatic Harmonic Description of Musical Signals Using Schema-based Chord Decomposition

F Carreras; Marc Leman; Micheline Lesaffre

This paper presents a model for the harmonic description of musical signals using schema-based chord decomposition. The model is based (i) on a bottom-up processing of musical signals into auditory images of different kinds and (ii) on a schema-based top-down processing of the images which produces chord decomposition. The schema is carried by a neural network and its content is derived from the self-organization of a pre-selected set of chords in all keys. The system architecture of this model is described and its performance is analyzed in detail through several excerpts of piano pieces. An evaluation procedure has been developed to compare the results with the ones derived using a music theoretical approach to chord decomposition.
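The paper's schema is a self-organizing neural network trained on a pre-selected set of chords in all keys. As a rough illustration of the underlying idea, decomposing an auditory (chroma-like) image onto a chord set, the sketch below uses a much simpler correlation against hand-built triad templates; the templates and input vector are invented for illustration and are not the paper's method.

```python
# Sketch (assumed, simplified): match a chroma-like vector against major/minor
# triad templates by correlation, as a stand-in for schema-based chord decomposition.
import numpy as np

NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def chord_template(root, intervals):
    """12-bin chroma template for a chord with the given root and intervals."""
    t = np.zeros(12)
    t[[(root + i) % 12 for i in intervals]] = 1.0
    return t / np.linalg.norm(t)

# Major and minor triads in all 12 keys.
templates = {f"{NOTES[r]}{q}": chord_template(r, iv)
             for r in range(12)
             for q, iv in (("maj", (0, 4, 7)), ("min", (0, 3, 7)))}

# Hypothetical chroma vector for one analysis frame (here: a noisy C major).
chroma = np.array([1.0, 0, 0.1, 0, 0.9, 0.1, 0, 0.8, 0, 0.1, 0, 0.1])
chroma /= np.linalg.norm(chroma)

scores = {name: float(chroma @ t) for name, t in templates.items()}
best = max(scores, key=scores.get)
print("best-matching chord:", best, "score:", round(scores[best], 3))
```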


Journal of New Music Research | 2012

The Music Paint Machine: Stimulating Self-monitoring Through the Generation of Creative Visual Output Using a Technology-enhanced Learning Tool

Luc Nijs; Bart Moens; Micheline Lesaffre; Marc Leman

In this paper, we discuss the pedagogically grounded and research-based design of a technology-enhanced learning tool, the Music Paint Machine. This interactive music system introduces a musical experience in which the musician creates a digital painting by playing an acoustic musical instrument and by moving the body on a coloured pressure mat. As a learning tool it aims at the development of musical creativity, at the stimulation of embodied understanding of music and at the development of an intimate relationship with the musical instrument. First, the methodological approach is outlined and the pedagogical and theoretical backgrounds are discussed. Then, we report on an experiment in which 51 amateur musicians participated. The experiment aimed at probing the application's potential to induce a flow experience and at learning how participants evaluate the didactic relevance of the Music Paint Machine. Results suggest that the Music Paint Machine has the potential to evoke a flow experience. Furthermore, participants acknowledged its didactic relevance with regard to learning to improvise, developing an understanding of musical parameters, and stimulating creativity.


Musicae Scientiae | 2010

User-Oriented Studies in Embodied Music Cognition Research

Marc Leman; Micheline Lesaffre; Luc Nijs; Alexander Deweppe

Music research aims at developing a research space that, in a proactive sense, can support the development of creative and cultural industries. In that context, we argue that a focus on users and their experiences in using tools may become more important in music research. It implies an expansion of the traditional methods of music psychology with methods that can address relevant features of musical action and musical tool use. This paper discusses epistemological and methodological issues related to this development.


International Conference on Acoustics, Speech, and Signal Processing | 2004

Recent improvements of an auditory model based front-end for the transcription of vocal queries

T. De Mulder; Jean-Pierre Martens; Micheline Lesaffre; Marc Leman; B. De Baets; H. De Meyer

In this paper, recent improvements of an existing acoustic front-end for the transcription of vocal (hummed, sung) musical queries are presented. Thanks to the addition of a new second pitch extractor and the introduction of a novel multi-stage segmentation algorithm, the application domain of the front-end could be extended to whistled queries, and, on top of that, the performance on the other two query types could be improved. Experiments have shown that the new system can transcribe vocal queries with an accuracy ranging from 76% (whistling) to 85% (humming), and that it clearly outperforms other state-of-the-art systems on all three query types.
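The paper's front-end is auditory-model based; as a generic illustration of the pipeline it describes, extracting a pitch contour from a vocal query and segmenting it into notes, the sketch below uses librosa's pYIN tracker and a naive segmentation rule on a synthetic three-note "query". The tracker, rule and signal are assumptions, not the authors' system.

```python
# Sketch (assumed, generic): pitch-track a synthetic hummed query with pYIN and
# segment the F0 contour into notes by rounded MIDI pitch changes.
import numpy as np
import librosa

sr = 16000
# Synthetic "hummed" query: three steady tones (A4, C5, E5), 0.4 s each.
freqs = [440.0, 523.25, 659.26]
y = np.concatenate([np.sin(2 * np.pi * f * np.arange(int(0.4 * sr)) / sr)
                    for f in freqs]).astype(np.float32)

f0, voiced, _ = librosa.pyin(y, fmin=80, fmax=1000, sr=sr)
midi = librosa.hz_to_midi(f0)

# Naive note segmentation: start a new note whenever the rounded MIDI pitch changes.
notes, current = [], None
for m, v in zip(midi, voiced):
    if not v or np.isnan(m):
        current = None
        continue
    p = int(round(m))
    if current is None or p != current:
        notes.append(p)
        current = p

print("transcribed MIDI pitches:", notes)   # expected roughly [69, 72, 76]
```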


International Journal of Human-computer Interaction | 2013

The “Conducting Master”: An Interactive, Real-Time Gesture Monitoring System Based on Spatiotemporal Motion Templates

Pieter-Jan Maes; Denis Amelynck; Micheline Lesaffre; Marc Leman; D. K. Arvind

Research in the field of embodied music cognition has shown the importance of coupled processes of body activity (action) and multimodal representations of these actions (perception) in how music is processed. Technologies in the field of human–computer interaction (HCI) provide excellent means to intervene in, and extend, these coupled action-perception processes. In this article, this model is applied to a concrete HCI application, called the “Conducting Master.” The application allows multiple users to interact in real time with the system in order to explore and learn how musical meter can be articulated into body movements (i.e., meter-mimicking gestures). Techniques are provided to model and automatically recognize these gestures in order to return multimodal feedback streams to the users. These techniques are based on template-based methods that treat meter-mimicking gestures explicitly as spatiotemporal patterns. To conclude, some concrete setups are presented in which the functionality of the Conducting Master was evaluated.
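Template-based matching of spatiotemporal gesture data can be sketched with dynamic time warping between an observed trajectory and stored templates. This is a generic stand-in, not necessarily the article's exact matching procedure, and the "3-beat"/"4-beat" templates below are invented for illustration.

```python
# Sketch (assumed, generic): recognize a gesture by comparing its trajectory
# against stored spatiotemporal templates with dynamic time warping (DTW).
import numpy as np

def dtw_distance(a, b):
    """DTW distance between two (T, D) trajectories using Euclidean frame costs."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Hypothetical 2-D templates for "three-beat" and "four-beat" cyclic gestures.
t = np.linspace(0, 1, 60)
templates = {
    "3-beat": np.stack([np.sin(3 * 2 * np.pi * t), np.cos(3 * 2 * np.pi * t)], axis=1),
    "4-beat": np.stack([np.sin(4 * 2 * np.pi * t), np.cos(4 * 2 * np.pi * t)], axis=1),
}

# Observed gesture: a slower, noisy version of the three-beat pattern.
t_obs = np.linspace(0, 1, 80)
observed = np.stack([np.sin(3 * 2 * np.pi * t_obs), np.cos(3 * 2 * np.pi * t_obs)], axis=1)
observed += np.random.default_rng(2).normal(scale=0.05, size=observed.shape)

scores = {name: dtw_distance(observed, tpl) for name, tpl in templates.items()}
print("recognized gesture:", min(scores, key=scores.get))
```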
