Alexander Refsum Jensenius
University of Oslo
Publications
Featured research published by Alexander Refsum Jensenius.
Lecture Notes in Computer Science | 2005
Rolf Inge Godøy; Egil Haga; Alexander Refsum Jensenius
Both musicians and non-musicians can often be seen making sound-producing gestures in the air without touching any real instruments. Such “air playing” can be regarded as an expression of how people perceive and imagine music, and studying the relationships between these gestures and sound might contribute to our knowledge of how gestures help structure our experience of music.
Musical Gestures: Sound, Movement, and Meaning | 2010
Alexander Refsum Jensenius; Marcelo M. Wanderley; Rolf Inge Godøy; Marc Leman
In the last decade, cognitive science underwent a paradigm shift by bringing human movement into the focus of research. Concepts such as ‘embodiment’ and ‘enactive’ have been proposed to capture the role of the human body in complex processes such as action and perception, and in the interaction of mind and physical environment (Varela et al., 1991; Noë, 2004). In music research, human movement has often been related to the notion of gesture. The reason is that many musical activities (performance, conducting, dancing) involve body movements that evoke meanings, and these movements are therefore called gestures. In Camurri et al. (2005), musical gestures are addressed from the viewpoint of their expressive character. However, there are many ways in which music-related body movements can be approached, measured, described and applied, and accordingly many ways in which musical gestures are meaningful. Given the different contexts in which gestures appear, and their close relationship to movement and meaning, one may be tempted to say that the notion of gesture is too broad, ill-defined and perhaps too vague. Yet the notion is very convenient in modern music research, because it bridges movement and meaning. A closer look at the term gesture reveals its potential as a core notion providing access to central issues in action/perception processes and in mind/environment interactions.
Developmental Medicine & Child Neurology | 2010
Lars Adde; Jorunn L. Helbostad; Alexander Refsum Jensenius; Gunnar Taraldsen; Kristine Hermansen Grunewaldt; Ragnhild Støen
Aim: The aim of this study was to investigate the predictive value of computer-based video analysis for the development of cerebral palsy (CP) in young infants.
Acta Acustica united with Acustica | 2010
Rolf Inge Godøy; Alexander Refsum Jensenius; Kristian Nymoen
In our own and other research on music-related actions, findings suggest that perceived action and sound are broken down into a series of chunks in people's minds when they perceive or imagine music. Chunks are here understood as holistically conceived and perceived fragments of action and sound, typically with durations in the range of 0.5 to 5 seconds. There is also evidence suggesting the occurrence of coarticulation within these chunks, meaning the fusion of small-scale actions and sounds into more superordinate actions and sounds. Various aspects of chunking and coarticulation are discussed in view of their role in the production and perception of music, and it is suggested that coarticulation is an integral element of music that should be explored more extensively in the future.
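As a rough illustration of the kind of segmentation this implies, the sketch below splits a one-dimensional quantity-of-motion signal into chunks of 0.5 to 5 seconds at local minima. The signal, sample rate, and thresholds are illustrative assumptions, not taken from the paper.

```python
# A minimal sketch of chunking, assuming chunk boundaries fall at local
# minima of a 1-D "quantity of motion" signal. All parameters are
# illustrative, not from the study.
import numpy as np
from scipy.signal import find_peaks

def chunk_boundaries(qom, fs, min_dur=0.5, max_dur=5.0):
    """Split a quantity-of-motion signal into chunks of 0.5-5 s."""
    # Troughs of qom = peaks of its negation; enforce minimum chunk length.
    troughs, _ = find_peaks(-qom, distance=int(min_dur * fs))
    edges = [0, *troughs.tolist(), len(qom)]
    chunks = []
    for start, end in zip(edges[:-1], edges[1:]):
        # Split any over-long segment into max_dur-sized pieces.
        while (end - start) / fs > max_dur:
            chunks.append((start, start + int(max_dur * fs)))
            start += int(max_dur * fs)
        if end > start:
            chunks.append((start, end))
    return chunks

# Example: a synthetic 20 s signal sampled at 100 Hz.
fs = 100.0
t = np.arange(0, 20, 1 / fs)
qom = np.abs(np.sin(2 * np.pi * 0.4 * t)) + 0.05 * np.random.rand(t.size)
for start, end in chunk_boundaries(qom, fs):
    print(f"chunk: {start / fs:.2f}-{end / fs:.2f} s")
```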
Computational Science and Engineering | 2012
Tobias Becker; Andreas Agne; Peter R. Lewis; Rami Bahsoon; Funmilade Faniyi; Lukas Esterle; Ariane Keller; Arjun Chandra; Alexander Refsum Jensenius; Stephan C. Stilkerich
Modern computing systems continue to evolve towards increasingly complex, heterogeneous and distributed architectures. At the same time, functionality and performance are no longer the only concerns when developing applications for such systems; flexibility, power efficiency, resource usage, reliability and cost are becoming increasingly important. This raises the question not only of how to develop applications efficiently for such systems, but also of how to cope with dynamic changes in application behaviour or in the system environment. The EPiCS project aims to address these aspects by exploring self-awareness and self-expression. Self-awareness allows systems and applications to gather and maintain information about their current state and environment, and to reason about their behaviour. Self-expression enables systems to adapt their behaviour autonomously to changing conditions. Innovations in EPiCS are based on the systematic integration of research on concepts and foundations, customisable hardware/software platforms and operating systems, and self-aware networking and middleware infrastructure. The developed technologies are validated in three application domains: computational finance, distributed smart cameras and interactive mobile media systems.
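The observe/adapt cycle described here can be illustrated with a minimal sketch: a component that maintains a model of its own throughput and power draw (self-awareness) and switches operating mode accordingly (self-expression). All class names, metrics, and thresholds below are hypothetical; EPiCS itself targets integrated hardware/software platforms.

```python
# A minimal sketch of a self-aware / self-expressive component. Names and
# thresholds are illustrative assumptions, not part of the EPiCS project.
from dataclasses import dataclass

@dataclass
class SelfModel:
    throughput: float = 0.0   # observed requests per second
    power_watts: float = 0.0  # observed power draw

class AdaptiveNode:
    MODES = ("low_power", "balanced", "performance")

    def __init__(self):
        self.model = SelfModel()
        self.mode = "balanced"

    def observe(self, throughput, power_watts):
        """Self-awareness: maintain information about the current state."""
        self.model.throughput = throughput
        self.model.power_watts = power_watts

    def express(self):
        """Self-expression: adapt behaviour to the observed state."""
        if self.model.power_watts > 10.0:
            self.mode = "low_power"
        elif self.model.throughput < 50.0:
            self.mode = "performance"
        else:
            self.mode = "balanced"
        return self.mode

node = AdaptiveNode()
node.observe(throughput=30.0, power_watts=4.0)
print(node.express())  # -> "performance": raise effort while power is cheap
```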
Tests and Proofs | 2013
Kristian Nymoen; Rolf Inge Godøy; Alexander Refsum Jensenius; Jim Torresen
Links between music and body motion can be studied through experiments called sound-tracing. One of the main challenges in such research is to develop robust analysis techniques that can deal with the multidimensional data that musical sound and body motion present. This article evaluates four analysis methods applied to an experiment in which participants moved their hands following perceptual features of short sound objects. Motion capture data were analyzed and correlated with a set of quantitative sound features using four different methods: (a) a pattern recognition classifier, (b) t-tests, (c) Spearman's ρ correlation, and (d) canonical correlation. The article shows how the analysis methods complement each other, and how applying several analysis techniques to the same data set can broaden the knowledge gained from the experiment.
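Two of the four methods, Spearman's ρ and canonical correlation, can be sketched in a few lines on synthetic data. The feature names and data shapes below are illustrative assumptions, not the study's actual recordings.

```python
# A minimal sketch of (c) Spearman's rho between one motion feature and one
# sound feature, and (d) canonical correlation between the full feature sets.
import numpy as np
from scipy.stats import spearmanr
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
n_frames = 500
motion = rng.normal(size=(n_frames, 3))   # e.g. hand position x/y/z per frame
sound = rng.normal(size=(n_frames, 2))    # e.g. loudness and pitch per frame
sound[:, 0] += 0.8 * motion[:, 1]         # inject a correlation for the demo

# (c) Spearman's rho: rank correlation between vertical position and loudness.
rho, p_value = spearmanr(motion[:, 1], sound[:, 0])
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3g})")

# (d) Canonical correlation: linear combinations of motion and sound features
# that correlate maximally with each other.
cca = CCA(n_components=1)
m_scores, s_scores = cca.fit_transform(motion, sound)
r = np.corrcoef(m_scores[:, 0], s_scores[:, 0])[0, 1]
print(f"First canonical correlation = {r:.2f}")
```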
Human Factors in Computing Systems | 2013
Frédéric Bevilacqua; Sidney S. Fels; Alexander Refsum Jensenius; Michael J. Lyons; Norbert Schnell; Atau Tanaka
This SIG intends to investigate the ongoing dialogue between music technology and the field of human-computer interaction. Our specific aims are to consider major findings of musical interface research over recent years and discuss how these might best be conveyed to CHI researchers interested but not yet active in this area, as well as to consider how to stimulate future collaborations between the music technology and CHI research communities.
Leonardo | 2013
Alexander Refsum Jensenius
This paper presents an overview of techniques for creating visual displays of human body movement based on video recordings. First, a review of early movement and video visualization techniques is given, followed by an overview of techniques that the author has developed and used in the study of music-related body movements: motion history images, motion average images, motion history keyframe images and motiongrams. Finally, examples are given of how such visualization techniques have been used in empirical music research, in medical research and for creative applications.
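A minimal sketch of the frame-differencing idea behind such displays is given below, assuming OpenCV and a placeholder video path: each motion image is the absolute difference between consecutive grayscale frames, and a horizontal motiongram stacks the row-wise means of those motion images over time. Details of the author's actual implementation may differ.

```python
# A sketch of motion images and a motiongram. "video.avi" is a placeholder.
import cv2
import numpy as np

cap = cv2.VideoCapture("video.avi")
ok, prev = cap.read()
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

columns = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    motion = cv2.absdiff(gray, prev)          # motion image for this frame
    columns.append(motion.mean(axis=1))       # collapse each row to its mean
    prev = gray
cap.release()

# Time runs along the horizontal axis; vertical position is preserved.
motiongram = np.stack(columns, axis=1)
cv2.imwrite("motiongram.png", cv2.normalize(
    motiongram, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8))
```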
Computer Music Modeling and Retrieval | 2005
Alexander Refsum Jensenius; Rodolphe Koehly; Marcelo M. Wanderley
This paper presents our work on building low-cost music controllers intended for educational and creative use. The main idea was to build an electronic music controller, including sensors and a sensor interface, on a “10 euro” budget. We have experimented with turning commercially available USB game controllers into generic sensor interfaces, and with making sensors from cheap conductive materials such as latex, ink, porous materials, and video tape. Our prototype controller, the CheapStick, is comparable to controllers built with commercial sensors and interfaces, but at a fraction of the price.
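The “game controller as generic sensor interface” idea can be sketched with pygame and any HID-compliant USB gamepad, as below; replacing a stick's potentiometers with home-made resistive sensors would feed these same axes. The polling rate is an arbitrary choice, and the original work predates this particular library setup.

```python
# A minimal sketch of reading a USB game controller as a generic sensor
# interface, assuming pygame and one connected HID-compliant gamepad.
import pygame

pygame.init()
pygame.joystick.init()
stick = pygame.joystick.Joystick(0)  # first connected controller
stick.init()

clock = pygame.time.Clock()
while True:
    pygame.event.pump()              # refresh the device state
    values = [stick.get_axis(i) for i in range(stick.get_numaxes())]
    print([f"{v:+.2f}" for v in values])  # each axis in [-1.0, 1.0]
    clock.tick(30)                   # poll at ~30 Hz
```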
Physiotherapy Theory and Practice | 2013
Lars Adde; Jorunn L. Helbostad; Alexander Refsum Jensenius; Mette Langaas; Ragnhild Støen
This study evaluates the role of postterm age at assessment and the use of one or two video recordings for the detection of fidgety movements (FMs) and prediction of cerebral palsy (CP) using computer vision software. Recordings between 9 and 17 weeks postterm age from 52 preterm and term infants (24 boys, 28 girls; 26 born preterm) were used and analyzed with computer vision software. Movement variables, derived from differences between subsequent video frames, were used for quantitative analysis. Sensitivities, specificities, and areas under the curve were estimated for the first recording, the second recording, and the mean of both. FMs were classified according to the Prechtl approach to general movement assessment, and CP status was reported at 2 years. Nine children developed CP, and all of their recordings showed absent FMs. The mean variability of the centroid of motion (CSD) from two recordings was more accurate than that from a single recording, and identified all children who were diagnosed with CP at 2 years. Age at assessment did not influence the detection of FMs or the prediction of CP. The accuracy of computer vision techniques in identifying FMs and predicting CP based on two recordings should be confirmed in future studies.
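A minimal sketch of the kind of movement variable described here, assuming OpenCV: the centroid of motion as the intensity-weighted centre of each thresholded frame-difference image, with the standard deviation of its trajectory standing in for the CSD variable. The file name and motion threshold are placeholders, and the published software surely differs in detail.

```python
# A sketch of the centroid of motion and its variability, computed from
# inter-frame differences. Path and threshold are illustrative placeholders.
import cv2
import numpy as np

cap = cv2.VideoCapture("infant_recording.avi")
ok, prev = cap.read()
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

centroids = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev)                    # inter-frame motion
    _, motion = cv2.threshold(diff, 20, 255, cv2.THRESH_BINARY)
    m = cv2.moments(motion, binaryImage=True)
    if m["m00"] > 0:                                  # any motion present?
        centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    prev = gray
cap.release()

xy = np.array(centroids)
csd = xy.std(axis=0)  # variability of the centroid of motion, per axis
print(f"centroid variability (x, y): {csd[0]:.1f}, {csd[1]:.1f} px")
```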