Publication


Featured research published by Charles Martin.


Computational Intelligence | 2018

RoboJam: A Musical Mixture Density Network for Collaborative Touchscreen Interaction

Charles Martin; Jim Torresen

RoboJam is a machine-learning system for generating music that assists users of a touchscreen music app by performing responses to their short improvisations. This system uses a recurrent artificial neural network to generate sequences of touchscreen interactions and absolute timings, rather than high-level musical notes. To accomplish this, RoboJam’s network uses a mixture density layer to predict appropriate touch interaction locations in space and time. In this paper, we describe the design and implementation of RoboJam’s network and how it has been integrated into a touchscreen music app. A preliminary evaluation analyses the system in terms of training, musical generation and user interaction.
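The mixture-density sampling step described above can be illustrated with a minimal numpy sketch. The mixture parameters below are invented toy values; in RoboJam itself they are produced by the recurrent network conditioned on the user's improvisation.

```python
import numpy as np

def sample_touch(pi, mu, sigma, rng):
    """Sample one (x, y, dt) touch event from a mixture of Gaussians.

    pi:    (K,) mixture weights, summing to 1
    mu:    (K, 3) component means over x, y and time delta
    sigma: (K, 3) per-dimension standard deviations
    """
    k = rng.choice(len(pi), p=pi)        # pick a mixture component
    return rng.normal(mu[k], sigma[k])   # draw a location and timing

# Toy two-component mixture favouring two screen regions.
rng = np.random.default_rng(0)
pi = np.array([0.7, 0.3])
mu = np.array([[0.2, 0.2, 0.1],
               [0.8, 0.8, 0.3]])
sigma = np.full((2, 3), 0.05)

touch = sample_touch(pi, mu, sigma, rng)  # one predicted touch event
```

Sampling repeatedly, feeding each sampled event back into the network, is how a sequence of response touches would be generated.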


Archive | 2016

A Percussion-Focussed Approach to Preserving Touch-Screen Improvisation

Charles Martin; Henry J. Gardner

Musical performances with touch-screen devices can be recorded by capturing a log of touch interactions. This object can serve as an archive or as a basis for other representations of the musical work. This chapter presents a protocol for recording ensemble touch-screen performances and details the processes for generating visualisations, gestural classifications, and graphical scores from these logs. Our experience of using these new representations to study a series of improvised ensemble performances with iPad-based digital musical instruments leads us to conclude that these new-media artefacts allow unique insights into ensemble interactions, comprehensive archiving of improvised performances, and the potential for re-synthesis into new performances and artworks.
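A touch-interaction log of the kind described above can be sketched as a simple table of timed events per device. The column layout here is a hypothetical stand-in, not the chapter's actual protocol.

```python
import csv
import io

# Hypothetical column layout for one ensemble touch log.
LOG = """time,device,x,y,moving
0.00,ipad-1,0.12,0.80,0
0.05,ipad-2,0.55,0.43,1
0.11,ipad-1,0.14,0.78,1
"""

def events_by_device(log_text):
    """Group logged touch events per performer's device."""
    events = {}
    for row in csv.DictReader(io.StringIO(log_text)):
        events.setdefault(row["device"], []).append(
            (float(row["time"]), float(row["x"]), float(row["y"])))
    return events

perf = events_by_device(LOG)  # {"ipad-1": [...], "ipad-2": [...]}
```

Once grouped this way, the per-device event streams can be replayed, visualised, or classified into gestures.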


Genetic and Evolutionary Computation Conference | 2018

Real-world evolution adapts robot morphology and control to hardware limitations

Tønnes F. Nygaard; Charles Martin; Eivind Samuelsen; Jim Torresen; Kyrre Glette

For robots to handle the numerous factors that can affect them in the real world, they must adapt to changes and unexpected events. Evolutionary robotics tries to solve some of these issues by automatically optimizing a robot for a specific environment. Most of the research in this field, however, uses simplified representations of the robotic system in software simulations. The large gap between performance in simulation and the real world makes it challenging to transfer the resulting robots to the real world. In this paper, we apply real-world multi-objective evolutionary optimization to optimize both control and morphology of a four-legged mammal-inspired robot. We change the supply voltage of the system, reducing the available torque and speed of all joints, and study how this affects the fitness as well as the morphology and control of the solutions. In addition to demonstrating that this real-world evolutionary scheme for morphology and control is indeed feasible with relatively few evaluations, we show that evolution under the different hardware limitations results in comparable performance for low and moderate speeds, and that the search achieves this by adapting both the control and the morphology of the robot.
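The selection step of a multi-objective search like the one above can be sketched as a Pareto-front filter over an evaluated population. The objectives and robot parameters below are toy stand-ins, not the paper's actual fitness measures or encoding.

```python
import random

def dominates(a, b):
    """True if fitness vector a Pareto-dominates b (maximising all objectives)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(population, fitness):
    """Keep the non-dominated individuals from one evaluated generation."""
    fits = [fitness(ind) for ind in population]
    return [ind for ind, f in zip(population, fits)
            if not any(dominates(g, f) for g in fits)]

# Toy stand-in for two objectives such as speed and stability,
# as a function of two morphology/control parameters.
def toy_fitness(ind):
    leg_len, gait_freq = ind
    return (gait_freq * leg_len, 1.0 - abs(leg_len - 0.5))

rng = random.Random(1)
pop = [(rng.uniform(0.2, 0.8), rng.uniform(0.5, 2.0)) for _ in range(20)]
front = pareto_front(pop, toy_fitness)  # survivors for the next generation
```

In a real-world setting, each `fitness` call would be an evaluation on the physical robot, which is why keeping the number of evaluations small matters.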


Audio Mostly Conference | 2017

Deep Models for Ensemble Touch-Screen Improvisation

Charles Martin; Kai Olav Ellefsen; Jim Torresen

For many, the pursuit and enjoyment of musical performance goes hand-in-hand with collaborative creativity, whether in a choir, jazz combo, orchestra, or rock band. However, few musical interfaces use the affordances of computers to create or enhance ensemble musical experiences. One possibility for such a system would be to use an artificial neural network (ANN) to model the way other musicians respond to a single performer. Some forms of music have well-understood rules for interaction; however, this is not the case for free improvisation with new touch-screen instruments where styles of interaction may be discovered in each new performance. This paper describes an ANN model of ensemble interactions trained on a corpus of such ensemble touch-screen improvisations. The results show realistic ensemble interactions and the model has been used to implement a live performance system where a performer is accompanied by the predicted and sonified touch gestures of three virtual players.


Contemporary Music Review | 2017

Percussionist-Centred Design for Touchscreen Digital Musical Instruments

Charles Martin

This article describes how percussive interaction informed the design, development, and deployment of a series of touchscreen digital musical instruments for ensembles. Percussion has previously been defined by techniques for exploring and interacting with instruments, rather than by the instruments themselves. Percussionists routinely co-opt unusual objects as instruments or create them from scratch. In this article, this process is used for the iterative design and evaluation of five mobile music apps by percussion ensembles. The groups helped refine the apps from prototype to performance through research rehearsals where they improvised, explored new musical gestures, and collaborated to develop practical performance strategies. As a result, the affordances and limitations of the apps were discovered, as were a vocabulary of percussive touch gestures. This article argues that this percussionist-centred process was an effective method for developing musical apps, and that it could be applied more widely in designing musical computer systems.


Human Factors in Computing Systems | 2016

Intelligent Agents and Networked Buttons Improve Free-Improvised Ensemble Music-Making on Touch-Screens

Charles Martin; Henry J. Gardner; Ben Swift; Michael A. Martin

We present the results of two controlled studies of free-improvised ensemble music-making on touch-screens. In our system, updates to an interface of harmonically-selected pitches are broadcast to every touch-screen in response to either a performer pressing a GUI button, or to interventions from an intelligent agent. In our first study, analysis of survey results and performance data indicated significant effects of the button on performer preference, but of the agent on performance length. In the second follow-up study, a mixed-initiative interface, where the presence of the button was interlaced with agent interventions, was developed to leverage both approaches. Comparison of this mixed-initiative interface with the always-on button-plus-agent condition of the first study demonstrated significant preferences for the former. The different approaches were found to shape the creative interactions that take place. Overall, this research offers evidence that an intelligent agent and a networked GUI both improve aspects of improvised ensemble music-making.


Human Factors in Computing Systems | 2014

Exploring Percussive Gesture on iPads with Ensemble Metatone

Charles Martin; Henry J. Gardner; Benjamin Swift


New Interfaces for Musical Expression | 2015

Tracking Ensemble Performance on Touch-Screens with Gesture Classification and Transition Matrices

Charles Martin; Henry J. Gardner; Ben Swift


Archive | 2015

That Syncing Feeling: Networked Strategies for Enabling Ensemble Creativity in iPad Musicians

Charles Martin; Henry J. Gardner


2015

Music of 18 Performances: Evaluating Apps and Agents with Free Improvisation

Charles Martin; Henry J. Gardner; Ben Swift; Michael A. Martin

Collaboration


Dive into Charles Martin's collaborations.

Top Co-Authors

Henry J. Gardner
Australian National University

Ben Swift
Australian National University

Benjamin Swift
Australian National University

Michael A. Martin
Australian National University

Kai Olav Ellefsen
Norwegian University of Science and Technology