
Publication


Featured research published by Robert Rowe.


Journal of New Music Research | 2010

Automated Music Emotion Recognition: A Systematic Evaluation

Arefin Huq; Juan Pablo Bello; Robert Rowe

Automated music emotion recognition (MER) is a challenging task in Music Information Retrieval with wide-ranging applications. Some recent studies pose MER as a continuous regression problem in the Arousal-Valence (AV) plane. These approaches are variations on a common architecture: a universal model of emotional response, a common repertoire of low-level audio features, a bag-of-frames approach to audio analysis, and relatively small data sets. They achieve some success at MER and suggest that further improvements are possible with current technology. Our contribution to the state of the art is to examine just how far one can go within this framework, and to investigate its limitations. We present the results of a systematic study conducted in an attempt to maximize the prediction performance of an automated MER system using the architecture described. We begin with a carefully constructed data set, emphasizing quality over quantity. We address affect induction rather than affect attribution. We consider a variety of algorithms at each stage of the training process, from preprocessing to feature selection and model selection, and we report the results of extensive testing. We found that: (1) none of the variations we considered leads to a substantial improvement in performance, which we present as evidence of a limit on what is achievable under this architecture, and (2) the size of the small data sets that are commonly used in the MER literature limits the possibility of improving the set of features used in MER due to the phenomenon of Subset Selection Bias. We conclude with some proposals for advancing the state of the art.
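The architecture the abstract describes (clip-level bag-of-frames features feeding a continuous regressor in the AV plane) can be sketched minimally. This is not the authors' system: the frames are synthetic stand-ins for real audio features, the annotations are hypothetical, and closed-form ridge regression stands in for the model-selection stage.

```python
import numpy as np

rng = np.random.default_rng(0)

# Clip-level "bag of frames" descriptor: summarize a clip's frame-wise
# features by their mean and standard deviation, ignoring frame order.
def bag_of_frames_descriptor(frames):
    return np.concatenate([frames.mean(axis=0), frames.std(axis=0)])

# Synthetic stand-ins for low-level audio features (hypothetical data).
n_clips, n_frames, n_feats = 60, 100, 8
clips = rng.normal(size=(n_clips, n_frames, n_feats))
X = np.stack([bag_of_frames_descriptor(c) for c in clips])

# Hypothetical continuous Arousal-Valence annotations in [-1, 1].
Y = rng.uniform(-1.0, 1.0, size=(n_clips, 2))

# Ridge regression (closed form) as a simple regressor in the AV plane.
lam = 1.0
Xb = np.hstack([X, np.ones((n_clips, 1))])        # append a bias column
W = np.linalg.solve(Xb.T @ Xb + lam * np.eye(Xb.shape[1]), Xb.T @ Y)
pred = Xb @ W                                     # predicted (arousal, valence)
```

With only 60 clips and 17 regression inputs, the sketch also hints at why small data sets invite the Subset Selection Bias the paper warns about: any feature-selection step tuned on so few examples will overfit the selection itself.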


Computer Music Journal | 1992

Machine Listening and Composing with Cypher

Robert Rowe

Cypher is a real-time interactive music system with two major components: a listener and a player. The listener analyzes streams of MIDI data. The player uses various algorithmic techniques to produce new musical output. Both components are made up of many small, interconnected agents operating on several hierarchical levels. The listener classifies features in the input and their behavior over time, sending messages that communicate this analysis to the player. Features classified include speed, density, dynamics, harmony, and rhythm. A user of Cypher can configure the player component to react to such messages by executing compositional methods that produce new music in response. A graphical interface allows the interactive specification of relations between feature classifications and types of response. Collections of relations can be saved, then recalled during performance by a score orientation component that tracks human performance and executes state changes at predetermined points in the score. Furthermore, an internal critic analyzes and alters Cypher's own output, representing a set of programmed aesthetic preferences and ensuring a consistency of style in the program's responses. Cypher plays composed music in a manner that is sensitive to live human performance cues. It is able to analyze and respond creatively to unknown music. Finally, it can compose without input, using algorithms to either transform remembered material or generate new musical output.

The work described here is an applied music theory; ideas about the description and generation of music are formalized to the point that they can be implemented in a computer program and tested in real time. Though textual traces of program activity can provide a detailed written account of the processing performed, the application of theoretical ideas in a real-time context brings the intellectual enterprise to a point of contact with complete musical examples, where the ear can judge the success of the theory.
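The listener/player division described above can be illustrated schematically. The `Note` event, the speed threshold, and the response table below are hypothetical stand-ins, not Cypher's actual agents or compositional methods.

```python
from dataclasses import dataclass

# Hypothetical MIDI note event (onset time in seconds, pitch, velocity).
@dataclass
class Note:
    time: float
    pitch: int
    velocity: int

# Listener side: classify one feature of the input stream (here, speed).
def classify_speed(notes):
    if len(notes) < 2:
        return "slow"
    span = notes[-1].time - notes[0].time
    rate = (len(notes) - 1) / max(span, 1e-9)   # notes per second
    return "fast" if rate > 6 else "slow"

# Player side: a user-configurable mapping from listener messages to
# compositional responses (placeholder methods, for illustration only).
responses = {
    "fast": lambda: "thin the texture",
    "slow": lambda: "add ornamentation",
}

stream = [Note(t * 0.1, 60 + t, 90) for t in range(12)]  # ~10 notes/sec
message = classify_speed(stream)     # listener sends a classification...
reaction = responses[message]()      # ...and the player reacts to it
```

The point of the sketch is the message-passing shape: analysis agents emit classifications, and response methods are bound to those classifications rather than to the raw MIDI stream.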


Music Perception: An Interdisciplinary Journal | 2000

Key Induction in the Context of Interactive Performance

Robert Rowe

Several algorithms for finding the tonal center of a musical context are extant in the literature. For use in interactive music systems, we are interested in algorithms that are fast enough to run in real time and that need only make reference to the material as it appears in sequence. In this article, I examine a number of such algorithms and the ways in which their contribution to real-time algorithmic listening can be bolstered by reference to concurrent analyzers working on other tasks. Though as part of the discussion I review my own key finder, the focus here is on the coordination of published methods using control structures for multiprocess analysis and their application in performance.
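One published key finder of the kind surveyed here, fast enough for real-time use, is the Krumhansl-Schmuckler profile-correlation method: correlate a running pitch-class histogram against all 24 rotations of the major and minor key profiles. A minimal sketch (the MIDI input list is illustrative):

```python
import numpy as np

# Krumhansl-Kessler major and minor key profiles (published values).
MAJOR = np.array([6.35, 2.23, 3.48, 2.33, 4.38, 4.09,
                  2.52, 5.19, 2.39, 3.66, 2.29, 2.88])
MINOR = np.array([6.33, 2.68, 3.52, 5.38, 2.60, 3.53,
                  2.54, 4.75, 3.98, 2.69, 3.34, 3.17])
NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def find_key(pitches):
    """Correlate the pitch-class histogram with all 24 rotated profiles."""
    hist = np.bincount(np.asarray(pitches) % 12, minlength=12).astype(float)
    best, best_r = None, -2.0
    for tonic in range(12):
        for profile, mode in ((MAJOR, "major"), (MINOR, "minor")):
            r = np.corrcoef(hist, np.roll(profile, tonic))[0, 1]
            if r > best_r:
                best, best_r = f"{NAMES[tonic]} {mode}", r
    return best

# As notes arrive in sequence, the estimate can be recomputed cheaply.
print(find_key([60, 62, 64, 65, 67, 69, 71, 72]))   # a C major scale
```

Because the histogram can be updated incrementally as each note arrives, the method needs only the material as it appears in sequence, which is exactly the constraint the article imposes on interactive systems.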


Journal of New Music Research | 2015

Five Perspectives on Musical Rhythm

Juan Pablo Bello; Robert Rowe; Carlos Guedes; Godfried T. Toussaint

Juan P. Bello, Robert Rowe, Carlos Guedes & Godfried Toussaint (2015). Five Perspectives on Musical Rhythm. Journal of New Music Research, 44(1), 1-2. ISSN: 0929-8215 (Print), 1744-5027 (Online). DOI: https://doi.org/10.1080/09298215.2014.996572


Journal of New Music Research | 2005

Real Time and Unreal Time: Expression in Distributed Performance

Robert Rowe

“Real time” indicates that the actions of a computer system take place at the same time as events in the environment to which the system is responding. The expressivity of an interactive multimedia system is directly related to its real-time operation: the aesthetic qualities of ensemble performance, or of a call-and-response interplay between human and machine forces, change dramatically when there is a significant delay in the computer's output. At the same time that some artistic opportunities are closed off, however, new ones emerge. In particular, this article explores the spaces made available through unreal time: those latencies between the immediacy of ensemble interaction and delays so long as to fall outside any kind of “real-time” performance, a range between about 20 and 2,000 milliseconds. A discussion of these latencies, as encountered in a composition staged in distributed performance over Internet2, closes the article.


Archive | 2001

Machine Musicianship

Robert Rowe


Music Perception: An Interdisciplinary Journal | 2009

Emotion in Motion: Investigating the Time-Course of Emotional Judgments of Musical Stimuli

Justin Pierre Bachorik; Marc Bangert; Psyche Loui; Kevin Larke; Jeff Berger; Robert Rowe; Gottfried Schlaug


Archive | 2009

MUSIC CLASSIFICATION SYSTEM AND METHOD

Robert Rowe; Jeff Berger; Juan Pablo Bello; Kevin Larke


International Computer Music Conference | 1997

Two Highly Integrated Real-Time Music and Graphics Performance Systems

Robert Rowe; Eric Singer


Computer Music Journal | 1992

Computer Music Currents 1

Robert Rowe; David Evan Jones; Michel Decoust; Charles Dodge; Jean-Baptiste Barrière; Trevor Wishart; Roger Reynolds

Collaboration


Dive into Robert Rowe's collaborations.

Top Co-Authors

Gottfried Schlaug
Beth Israel Deaconess Medical Center

Kevin Larke
University of California

Carlos Guedes
New York University Abu Dhabi

Godfried T. Toussaint
New York University Abu Dhabi