Publication


Featured research published by Gil Weinberg.


Computer Music Journal | 2005

Interconnected Musical Networks: Toward a Theoretical Framework

Gil Weinberg

This article attempts to define and classify the aesthetic and technical principles of interconnected musical networks. It presents an historical overview of technological innovations that were instrumental for the development of the field and discusses a number of paradigmatic musical networks that are based on these technologies. A classification of online and local-area musical networks then leads to an attempt to define a taxonomical and theoretical framework for musical interconnectivity, addressing goals and motivations, social organizations and perspectives, network architectures and topologies, and musical content and control. The article concludes with a number of design suggestions for the development of effective interconnected musical networks.


Human Factors in Computing Systems | 2006

Robot-human interaction with an anthropomorphic percussionist

Gil Weinberg; Scott Driscoll

The paper presents our approach for human-machine interaction with an anthropomorphic mechanical percussionist that can listen to live players, analyze perceptual musical aspects in real-time, and use the product of this analysis to play along in a collaborative manner. Our robot, named Haile, is designed to combine the benefits of computational power, perceptual modeling, and algorithmic music with the richness, visual interactivity, and expression of acoustic playing. We believe that when interacting with live players, Haile can facilitate a musical experience that is not possible by any other means, inspiring users to collaborate with it in novel and expressive ways. Haile can, therefore, serve as a test-bed for novel forms of musical human-machine interaction, bringing perceptual aspects of computer music into the physical world both visually and acoustically.
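
The listen-analyze-respond pipeline described here, detecting strokes in a live audio stream, estimating perceptual features such as loudness, and feeding them to an accompaniment behavior, can be pictured with a minimal sketch. The energy-based onset heuristic and all function names below are assumptions for illustration; this is not Haile's published analysis code.

```python
import numpy as np

def detect_onsets(audio, sr=44100, frame=512, threshold=2.0):
    """Rough energy-based onset detector over a mono signal.

    Returns onset times (seconds) and per-onset loudness estimates.
    A simplified stand-in for the perceptual analysis the abstract
    describes, not the published algorithm.
    """
    n_frames = len(audio) // frame
    energy = np.array([np.sum(audio[i * frame:(i + 1) * frame] ** 2)
                       for i in range(n_frames)])
    flux = np.diff(energy, prepend=energy[0])            # frame-to-frame energy rise
    mask = flux > threshold * (np.median(np.abs(flux)) + 1e-9)
    onset_frames = np.flatnonzero(mask)
    return onset_frames * frame / sr, energy[onset_frames]

def respond(onset_times, loudness):
    """Toy 'collaborative' response: echo the player's rhythm,
    accenting the louder strokes. Placeholder for Haile's
    interaction schemes."""
    accents = loudness >= np.median(loudness) if len(loudness) else loudness
    return list(zip(onset_times.tolist(), accents.tolist()))

# Example: a synthetic click at 0.5 s in one second of silence
sr = 44100
audio = np.zeros(sr)
audio[sr // 2:sr // 2 + 256] = 1.0
times, loudness = detect_onsets(audio, sr)
print(respond(times, loudness))
```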


Autonomous Robots | 2011

Interactive improvisation with a robotic marimba player

Guy Hoffman; Gil Weinberg

Shimon is an interactive robotic marimba player, developed as part of our ongoing research in Robotic Musicianship. The robot listens to a human musician and continuously adapts its improvisation and choreography while playing simultaneously with the human. We discuss the robot's mechanism and motion control, which uses physics simulation and animation principles to achieve both expressivity and safety. We then present an interactive improvisation system based on the notion of physical gestures for both musical and visual expression. The system also uses anticipatory action to enable real-time improvised synchronization with the human player. We describe a study evaluating the effect of embodiment on one of our improvisation modules: antiphony, a call-and-response musical synchronization task. We conducted a 3×2 within-subject study manipulating the level of embodiment and the accuracy of the robot's response. Our findings indicate that synchronization is aided by visual contact when uncertainty is high, but that pianists can resort to internal rhythmic coordination in more predictable settings. We find that visual coordination is more effective for synchronization in slow sequences, and that occluded physical presence may be less effective than audio-only note generation. Finally, we test the effects of visual contact and embodiment on audience appreciation. We find that visual contact in joint Jazz improvisation makes for a performance in which audiences rate the robot as playing better, more like a human, more responsive, and more inspired by the human. They also rate the duo as better synchronized, more coherent, communicating, and coordinated, and the human as more inspired and more responsive.


International Conference on Robotics and Automation | 2010

Gesture-based human-robot Jazz improvisation

Guy Hoffman; Gil Weinberg

We present Shimon, an interactive improvisational robotic marimba player developed for research in Robotic Musicianship. The robot listens to a human musician and continuously adapts its improvisation and choreography while playing simultaneously with the human. We discuss the robot's mechanism and motion control, which uses physics simulation and animation principles to achieve both expressivity and safety. We then present a novel interactive improvisation system based on the notion of gestures for both musical and visual expression. The system also uses anticipatory beat-matched action to enable real-time synchronization with the human player. Our system was implemented in a full-length human-robot Jazz duet, displaying highly coordinated melodic and rhythmic human-robot joint improvisation. We have performed with the system in front of a live public audience.
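
The phrase "anticipatory beat-matched action" suggests latency compensation: predicting the next beat from the human player's recent onsets and issuing the strike command early enough for the arm to land on time. The sketch below illustrates that reading under assumed constants and function names; it is not Shimon's actual scheduler.

```python
import time

ARM_LATENCY = 0.12  # assumed actuator travel time in seconds, illustrative only

def predict_next_beat(onset_times, history=4):
    """Estimate the next beat time from the last few inter-onset intervals."""
    if len(onset_times) < 2:
        return None
    recent = onset_times[-(history + 1):]
    intervals = [b - a for a, b in zip(recent, recent[1:])]
    period = sum(intervals) / len(intervals)
    return onset_times[-1] + period

def schedule_strike(onset_times, send_command):
    """Dispatch the strike ahead of the predicted beat by the arm latency."""
    beat = predict_next_beat(onset_times)
    if beat is None:
        return
    delay = beat - ARM_LATENCY - time.time()
    if delay > 0:
        time.sleep(delay)      # a real system would use a non-blocking timer
    send_command()             # e.g. trigger the mallet actuator

# Example: onsets arriving from a listener at roughly 120 BPM
now = time.time()
onsets = [now - 1.5, now - 1.0, now - 0.5, now]
schedule_strike(onsets, lambda: print("strike"))
```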


Human Factors in Computing Systems | 2010

Shimon: an interactive improvisational robotic marimba player

Guy Hoffman; Gil Weinberg

Shimon is an autonomous marimba-playing robot designed to create interactions with human players that lead to novel musical outcomes. The robot combines music perception, interaction, and improvisation with the capacity to produce melodic and harmonic acoustic responses through choreographic gestures. We developed an anticipatory action framework, and a gesture-based behavior system, allowing the robot to play improvised Jazz with humans in synchrony, fluently, and without delay. In addition, we built an expressive non-humanoid head for musical social communication. This paper describes our system, used in a performance and demonstration at the CHI 2010 Media Showcase.


Human-Robot Interaction | 2007

The interactive robotic percussionist: new developments in form, mechanics, perception and interaction design

Gil Weinberg; Scott Driscoll

We present new developments in the improvisational robotic percussionist project, aimed at improving human-robot interaction through design, mechanics, and perceptual modeling. Our robot, named Haile, listens to live human players, analyzes perceptual aspects of their playing in real-time, and uses the product of this analysis to play along in a collaborative and improvisatory manner. It is designed to combine the benefits of computational power in algorithmic music with the expression and visual interactivity of acoustic playing. Haile's new features include an anthropomorphic form, a linear-motor-based robotic arm, a novel perceptual modeling implementation, and a number of new interaction schemes. The paper begins with an overview of related work and a presentation of goals and challenges based on Haile's original design. We then describe new developments in physical design, mechanics, perceptual implementation, and interaction design, aimed at improving human-robot interactions with Haile. The paper concludes with a description of a user study, conducted in an effort to evaluate the new functionalities and their effectiveness in facilitating expressive musical human-robot interaction. The results of the study show a correlation between humans' and Haile's rhythmic perception, as well as user satisfaction regarding Haile's perceptual and mechanical abilities. The study also indicates areas for improvement, such as the need for better timbre and loudness control and more advanced and responsive interaction schemes.


Robot and Human Interactive Communication | 2007

The Design of a Perceptual and Improvisational Robotic Marimba Player

Gil Weinberg; Scott Driscoll

The paper presents the theoretical background and the design scheme for a perceptual and improvisational robotic marimba player that interacts with human musicians in a visual and acoustic manner. Informed by an evaluation of a previously developed robotic percussionist, we present the extension of our work to melodic and harmonic realms with the design of a robotic player that listens to, analyzes and improvises pitch-based musical materials. After a discussion of the motivation for the project, theoretical background and related work, we present a set of research questions followed by our hardware and software approaches designed to address these questions. The paper concludes with a description of our plans to implement and embed these approaches in the robotic marimba player that will be used in workshops and concerts.


Robot and Human Interactive Communication | 2005

Musical interactions with a perceptual robotic percussionist

Gil Weinberg; Scott Driscoll; R. Mitchell Parry

We present our approach for human-robot musical interaction using a perceptual and socially-oriented robotic percussionist. Our robot, named Haile, listens to live players, analyzes perceptual musical aspects in real-time, and uses the product of this analysis to play back in an acoustically rich manner, forming musical collaborations with human players. We conclude by proposing guidelines for pedagogy and an educational environment for learning music, math, acoustics, and programming through interaction with Haile.


Human-Robot Interaction | 2009

Interactive jamming with Shimon: a social robotic musician

Gil Weinberg; Aparna Raman; Trishul Mallikarjuna

The paper introduces Shimon: a socially interactive and improvisational robotic marimba player. It presents the interaction schemes used by Shimon in the realization of an interactive musical jam session among human and robotic musicians.


Computer Music Modeling and Retrieval | 2008

A Real-Time Genetic Algorithm in Human-Robot Musical Improvisation

Gil Weinberg; Mark Godfrey; Alex Rae; John Rhoads

The paper describes an interactive musical system that utilizes a genetic algorithm in an effort to create inspiring collaborations between human musicians and an improvisatory robotic xylophone player. The robot is designed to respond to human input in an acoustic and visual manner, evolving a human-generated phrase population based on a similarity driven fitness function in real time. The robot listens to MIDI and audio input from human players and generates melodic responses that are informed by the analyzed input as well as by internalized knowledge of contextually relevant material. The paper describes the motivation for the project, the hardware and software design, two performances that were conducted with the system, and a number of directions for future work.
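
A minimal sketch of the kind of similarity-driven genetic algorithm the abstract describes, evolving a population of melodic phrases against a human-played input, is given below. The pitch-distance fitness, mutation scheme, and parameters are illustrative assumptions, not the system's implementation, which also draws on internalized knowledge of contextually relevant material.

```python
import random

def fitness(phrase, target):
    """Higher when the candidate phrase is closer to the human input phrase."""
    dist = sum(abs(p - t) for p, t in zip(phrase, target))
    dist += 12 * abs(len(phrase) - len(target))   # penalize length mismatch
    return 1.0 / (1.0 + dist)

def mutate(phrase, rate=0.2):
    """Randomly nudge some pitches by up to a whole tone (MIDI note numbers)."""
    return [p + random.choice([-2, -1, 1, 2]) if random.random() < rate else p
            for p in phrase]

def crossover(a, b):
    """Single-point crossover between two phrases."""
    cut = random.randint(1, min(len(a), len(b)) - 1)
    return a[:cut] + b[cut:]

def evolve_response(human_phrase, population, generations=20, size=30):
    """Evolve a phrase population toward (but not onto) the human input."""
    for _ in range(generations):
        scored = sorted(population, key=lambda p: fitness(p, human_phrase),
                        reverse=True)
        parents = scored[:size // 2]
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(size - len(parents))]
        population = parents + children
    return max(population, key=lambda p: fitness(p, human_phrase))

# Example: seed phrases as MIDI pitches, respond to a short human-played lick
seeds = [[60, 62, 64, 67], [55, 59, 62, 66], [60, 63, 65, 70]]
population = [mutate(random.choice(seeds), rate=0.5) for _ in range(30)]
print(evolve_response([60, 62, 63, 65, 67], population))
```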

Collaboration


Dive into Gil Weinberg's collaborations.

Top Co-Authors

Mason Bretan
Georgia Institute of Technology

Scott Driscoll
Georgia Institute of Technology

Travis Thatcher
Georgia Institute of Technology

Mark Godfrey
Georgia Institute of Technology

Ryan Nikolaidis
Georgia Institute of Technology

Brian Blosser
Georgia Institute of Technology

Jason Freeman
Georgia Institute of Technology

Aaron Albin
Georgia Institute of Technology

Alex Rae
Georgia Institute of Technology