
Publication


Featured research published by Ajay Kapur.


Affective Computing and Intelligent Interaction | 2005

Gesture-Based affective computing on motion capture data

Asha Kapur; Ajay Kapur; Naznin Virji-Babul; George Tzanetakis; Peter F. Driessen

This paper presents research using full-body skeletal movements, captured with video-based sensor technology developed by Vicon Motion Systems, to train a machine to identify different human emotions. The Vicon system uses a series of six cameras to capture lightweight markers placed on various points of the body in 3D space, and digitizes movement into x, y, and z displacement data. Gestural data from five subjects was collected depicting four emotions: sadness, joy, anger, and fear. Experimental results with different machine learning techniques show that automatic classification of this data ranges from 84% to 92%, depending on how it is calculated. To put these automatic classification results into perspective, a user study on human perception of the same data was conducted, with an average classification accuracy of 93%.
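As a rough illustration of the classification step (the paper's actual features and classifiers are not reproduced here), a nearest-centroid sketch over hypothetical per-clip gesture features, using the four emotion labels from the abstract:

```python
# Minimal sketch: nearest-centroid classification of gesture feature
# vectors. The two features (mean marker speed, vertical extent) and all
# numeric values are synthetic placeholders, not the paper's data.
import math

def centroid(vectors):
    """Component-wise mean of a list of feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def classify(x, centroids):
    """Return the emotion whose centroid is nearest to x (Euclidean)."""
    return min(centroids, key=lambda e: math.dist(x, centroids[e]))

# Synthetic training data: one feature vector per gesture clip.
training = {
    "sadness": [[0.1, 0.2], [0.15, 0.25]],
    "joy":     [[0.9, 0.8], [0.85, 0.9]],
    "anger":   [[0.95, 0.3], [0.9, 0.35]],
    "fear":    [[0.3, 0.9], [0.35, 0.85]],
}
centroids = {e: centroid(v) for e, v in training.items()}
print(classify([0.12, 0.22], centroids))  # a slow, small movement
```

A real pipeline would extract many more features from the x, y, z displacement streams and use stronger learners, but the train-then-classify shape is the same.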


Organised Sound | 2005

Interactive Network Performance: a dream worth dreaming?

Ajay Kapur; Ge Wang; Philip Davidson; Perry R. Cook

This paper questions and examines the validity and future of interactive network performance. The history of research in the area is described, as well as experiments with our own system. Our custom-built networked framework, known as GIGAPOPR, transfers high-quality audio, video and MIDI data over a network connection to enable live musical performances to occur in two or more distinct locations. One of our first sensor-augmented Indian instruments, the Electronic Dholak (EDholak), is a multi-player networked percussion controller modelled after the traditional Indian Dholak. The EDholaks trigger sound, including samples and physical models, and visualisation, using our custom-built networked visualisation software, known as veldt.


Journal of New Music Research | 2003

The Electronic Tabla Controller

Ajay Kapur; Georg Essl; Philip Davidson; Perry R. Cook

This paper describes the design of an electronic Tabla controller (ETabla). Tabla are a pair of hand drums traditionally used to accompany North Indian vocal and instrumental music. The ETabla controls both sound and graphics simultaneously. It allows for a variety of traditional Tabla strokes and new performance techniques. Graphical feedback allows for artistic display and has potential pedagogical applications. This paper describes the evolution of the technology of the Tabla from its origins until the present day; the traditional playing style of the Tabla, on which the controller is modeled; the creation of a real-time Tabla controller, using force-sensors; the physical modeling of the sound of the Tabla using banded waveguide synthesis; the creation of a real-time graphics feedback system that reacts to the Tabla controller; experiments on measuring the response time of the ETabla sensors; and the description of the ETabla used in a live performance.
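One stage of such a controller can be sketched in code: turning a raw force-sensor reading into a strike velocity for the synthesis engine. The 10-bit ADC range, noise threshold, and square-root velocity curve below are assumptions for illustration, not the published ETabla design:

```python
# Hypothetical sketch of a force-sensor-to-velocity mapping stage.
# The 10-bit ADC range (0-1023), the noise threshold, and the
# square-root curve are illustrative assumptions.

def sensor_to_velocity(adc_value, adc_max=1023, threshold=30):
    """Map a raw ADC force reading to a strike velocity in [0.0, 1.0].

    Readings below the threshold are treated as sensor noise (no strike);
    a square-root curve gives finer control over soft strokes.
    """
    if adc_value < threshold:
        return 0.0
    normalised = (adc_value - threshold) / (adc_max - threshold)
    return round(normalised ** 0.5, 3)

print(sensor_to_velocity(20), sensor_to_velocity(1023))
```

The resulting velocity could then drive a synthesis model (the paper uses banded waveguide synthesis) and the graphical feedback system.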


New Interfaces for Musical Expression | 2007

Integrating hyperinstruments, musical robots & machine musicianship for North Indian classical music

Ajay Kapur; Eric Singer; Manjinder Singh Benning; George Tzanetakis; Trimpin

This paper describes a system enabling a human to perform music with a robot in real-time, in the context of North Indian classical music. We modify a traditional acoustic sitar into a hyperinstrument in order to capture performance gestures for musical analysis. A custom built four-armed robotic Indian drummer was built using a microchip, solenoids, aluminum and folk frame drums. Algorithms written towards intelligent machine musicianship are described. The final goal of this research is to have a robotic drummer accompany a professional human sitar player live in performance.


IEEE Transactions on Multimedia | 2011

Training Surrogate Sensors in Musical Gesture Acquisition Systems

Adam R. Tindale; Ajay Kapur; George Tzanetakis

Capturing the gestures of music performers is a common task in interactive electroacoustic music. The captured gestures can be mapped to sounds, synthesis algorithms, visuals, etc., or used for music transcription. Two of the most common approaches for acquiring musical gestures are: 1) “hyper-instruments” which are “traditional” musical instruments enhanced with sensors for directly detecting the gestures and 2) “indirect acquisition” in which the only sensor is a microphone capturing the audio signal. Hyper-instruments require invasive modification of existing instruments which is frequently undesirable. However, they provide relatively straightforward and reliable sensor measurements. On the other hand, indirect acquisition approaches typically require sophisticated signal processing and possibly machine learning algorithms in order to extract the relevant information from the audio signal. The idea of using direct sensor(s) to train a machine learning model for indirect acquisition is proposed in this paper. The resulting trained “surrogate” sensor can then be used in place of the original direct invasive sensor(s) that were used for training. That way, the instrument can be used unmodified in performance while still providing the gesture information that a hyper-instrument would provide. In addition, using this approach, large amounts of training data can be collected with minimum effort. Experimental results supporting this idea are provided in two detection contexts: 1) strike position on a drum surface and 2) strum direction on a sitar.
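The surrogate-sensor idea can be sketched as a two-pass procedure: a calibration pass where the direct sensor labels audio frames, then a performance pass where a model trained on those labels replaces the sensor. The 1-NN model, features, and data below are placeholder assumptions, not the paper's method:

```python
# Sketch of "surrogate sensor" training: during calibration, a direct
# sensor labels each audio event (here, drum strike position); a model
# trained on audio features then stands in for the removed sensor.
# Features and values are synthetic placeholders.
import math

def nearest_neighbour(feature, labelled):
    """1-NN lookup: return the label of the closest training feature."""
    return min(labelled, key=lambda pair: math.dist(feature, pair[0]))[1]

# Calibration pass: (audio feature vector, direct-sensor label) pairs.
# A real system might extract spectral centroid and energy per onset.
calibration = [
    ([0.8, 0.2], "center"), ([0.75, 0.25], "center"),
    ([0.2, 0.9], "edge"),   ([0.25, 0.85], "edge"),
]

# Performance pass: the direct sensor is gone; audio features alone
# recover the gesture information.
print(nearest_neighbour([0.78, 0.22], calibration))
```

Because labeling is automatic during calibration, large training sets come almost for free, which is one of the advantages the abstract highlights.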


Pacific Rim Conference on Communications, Computers and Signal Processing | 2005

Wearable sensors for real-time musical signal processing

Ajay Kapur; Eric L. Yang; Adam R. Tindale; Peter F. Driessen

This paper describes the use of wearable sensor technology to control parameters of audio effects for real-time musical signal processing. Traditional instrument performance techniques are preserved while the system modifies the resulting sound based upon the movements of the performer. Gesture data from a performing artist is captured using three-axis accelerometer packages and converted to MIDI (musical instrument digital interface) messages using microcontroller technology. ChucK, a new programming language for on-the-fly audio signal processing and sound synthesis, is used to collect and process synchronized gesture data and audio signals from the traditional instrument being performed. Case studies using the wearable sensors in a variety of locations on the body (head, hands, feet, etc.) with a number of different traditional instruments (tabla, sitar, drumset, turntables, etc.) are presented.
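The accelerometer-to-MIDI conversion stage can be sketched as a simple scaling: each axis is clamped to the sensor's range and mapped onto the 0-127 MIDI value range of a control-change message. The ±2 g range and the CC numbers 20-22 are assumptions for illustration, not the paper's configuration:

```python
# Minimal sketch of mapping a three-axis accelerometer reading to MIDI
# control-change messages, as a microcontroller might do before the data
# reaches ChucK. Sensor range and CC numbers are assumptions.

def axis_to_cc(g, g_range=2.0):
    """Scale an acceleration in [-g_range, +g_range] g to a MIDI value 0-127."""
    clamped = max(-g_range, min(g_range, g))
    return round((clamped + g_range) / (2 * g_range) * 127)

def accel_to_midi(x, y, z, channel=0):
    """Emit one CC message per axis (status byte 0xB0 | channel)."""
    status = 0xB0 | channel
    return [(status, cc, axis_to_cc(g))
            for cc, g in ((20, x), (21, y), (22, z))]

print(accel_to_midi(0.0, 1.0, -2.0))
```

On the receiving side, a language like ChucK would read these CC values and map them onto effect parameters such as delay time or filter cutoff.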


Multimedia Signal Processing | 2005

Subband-based Drum Transcription for Audio Signals

George Tzanetakis; Ajay Kapur; Richard I. McWalter

Content-based analysis of music can help manage the increasing amounts of music information available digitally and is becoming an important part of multimedia research. The use of drums and percussive sounds is pervasive in popular and world music. In this paper we describe an automatic system for detecting and transcribing low and medium-high frequency drum events from audio signals. Two different subband front-ends are utilized. The first is based on bandpass filters and the second is based on wavelet analysis. Experimental results utilizing music, drum loops and Indian tabla thekas as signals are provided. The proposed system can be used as a preprocessing step for rhythm-based music classification and retrieval. In addition, it can be used for pedagogical purposes.
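The subband front-end idea can be sketched minimally: filter the signal so that only one frequency band survives, then pick onsets where frame energy jumps above a threshold. The one-pole low-pass and the threshold below are illustrative stand-ins for the paper's bandpass and wavelet front-ends:

```python
# Sketch of a subband onset detector: a one-pole low-pass isolates
# low-frequency (kick-like) energy, and onsets are reported for frames
# whose mean energy rises above a threshold. Filter and threshold are
# illustrative, not the paper's tuned front-ends.

def lowpass(signal, alpha=0.1):
    """One-pole low-pass: y[n] = y[n-1] + alpha * (x[n] - y[n-1])."""
    y, out = 0.0, []
    for x in signal:
        y += alpha * (x - y)
        out.append(y)
    return out

def onset_frames(signal, frame=4, threshold=0.05):
    """Indices of frames whose mean energy rises above the threshold."""
    energies = [sum(s * s for s in signal[i:i + frame]) / frame
                for i in range(0, len(signal), frame)]
    return [i for i in range(1, len(energies))
            if energies[i] > threshold and energies[i - 1] <= threshold]

# Silence, then a low-frequency thump: its frame registers as an onset.
sig = [0.0] * 8 + [1.0] * 4 + [0.0] * 8
print(onset_frames(lowpass(sig)))
```

Running a second detector on a high-frequency band the same way would separate medium-high drum events (e.g. snare or tabla strokes) from the low ones, which is the essence of the subband transcription approach.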


Journal of New Music Research | 2005

Preservation and extension of traditional techniques: Digitizing North Indian performance

Ajay Kapur; Philip Davidson; Perry R. Cook; W. Andrew Schloss; Peter F. Driessen

This article describes systems for capturing gestures from a performing artist playing North Indian instruments. Modified traditional instruments use sensor technology and microcontrollers to digitize performance, enabling a computer to synthesize sound and generate visual meaning. Specifically, systems were built to capture data from three traditional North Indian instruments: the tabla (a pair of tonal hand drums), the dholak (a barrel-shaped folk drum played by two people) and the sitar (a 19-stringed, gourd-shelled instrument). The article discusses how these instruments are modified to capture gestural movement, how these signals are mapped to sounds and graphical feedback, and gives examples of the new instruments being used in live performance. Modified performance techniques with the aid of a laptop computer are introduced; however, the hardware is built to try to preserve the techniques passed down from generations of tradition.


Multimedia Signal Processing | 2007

Multimodal Sensor Analysis of Sitar Performance: Where is the Beat?

Manjinder Singh Benning; Ajay Kapur; Bernie C. Till; George Tzanetakis

In this paper we describe a system for detecting the tempo of sitar performance using a multimodal signal processing approach. Real-time measurements are obtained from sensors on the instrument and from wearable sensors on the performer's body. Experiments comparing audio-based and sensor-based tempo tracking are described. The real-time tempo tracking method is based on extracting onsets and applying Kalman filtering. We show how late fusion of the audio and sensor tempo estimates can improve tracking. The obtained results are used to inform design parameters for a real-time system for human-robot musical performance.
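The tracking-and-fusion chain can be sketched as follows: inter-onset intervals update a scalar Kalman filter whose state is the beat period, and the audio-based and sensor-based estimates are then late-fused by inverse-variance weighting. The noise values and data are illustrative, not the paper's tuned parameters:

```python
# Sketch: scalar Kalman filter over the beat period, fed by inter-onset
# intervals, plus inverse-variance late fusion of two modality estimates.
# Process/measurement noise (q, r) and the intervals are illustrative.

def kalman_tempo(intervals, q=0.001, r=0.01):
    """Track beat period (s) from inter-onset intervals; return (mean, var)."""
    period, var = intervals[0], 1.0        # initial state and uncertainty
    for z in intervals[1:]:
        var += q                           # predict: period assumed constant
        k = var / (var + r)                # Kalman gain
        period += k * (z - period)         # correct with the new interval
        var *= (1 - k)
    return period, var

def fuse(est_a, est_b):
    """Inverse-variance weighted fusion of two (mean, variance) estimates."""
    (ma, va), (mb, vb) = est_a, est_b
    w = vb / (va + vb)
    return w * ma + (1 - w) * mb

audio = kalman_tempo([0.50, 0.52, 0.49, 0.51])   # audio onset intervals
sensor = kalman_tempo([0.50, 0.50, 0.51, 0.50])  # wearable-sensor intervals
period = fuse(audio, sensor)
print(round(60.0 / period))  # fused tempo estimate in BPM
```

Weighting by inverse variance means the modality that is currently more certain dominates the fused estimate, which is one simple way late fusion can outperform either stream alone.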


Pacific Rim Conference on Communications, Computers and Signal Processing | 2007

A Comparative Study on Wearable Sensors for Signal Processing on the North Indian Tabla

Manjinder Singh Benning; Ajay Kapur; Bernie C. Till; George Tzanetakis; Peter F. Driessen

This paper describes experimentation using a variety of sensor techniques to capture body gestures and train a student performing the North Indian hand drums known as the Tabla. A comparative study of motion capture systems, wearable accelerometer units, and wireless inertial sensor packages is described. Each acquisition method has its advantages and disadvantages, which are explored through trial and error. The paper describes a number of applications using real-time signal processing techniques for analysis, performance, performer posture detection and machine perception of human interaction.
