Mason Bretan
Georgia Institute of Technology
Publications
Featured research published by Mason Bretan.
International Journal of Human-Computer Studies / International Journal of Man-Machine Studies | 2015
Mason Bretan; Guy Hoffman; Gil Weinberg
For social robots to respond to humans in an appropriate manner, they need to use apt affect displays, revealing underlying emotional intelligence. We present an artificial emotional intelligence system for robots, with both a generative and a perceptual aspect. On the generative side, we explore the expressive capabilities of an abstract, faceless, creature-like robot with very few degrees of freedom, lacking both facial expressions and the complex humanoid design often found in emotionally expressive robots. We validate our system in a series of experiments. In the first study, we find an advantage in classification for animated vs. static affect expressions, as well as advantages in valence and arousal estimation and personal preference ratings for both animated vs. static and physical vs. on-screen expressions. In a second experiment, we show that our parametrically generated expression variables correlate with the intended user affect perception. Combining the generative system with a perceptual component of natural language sentiment analysis, we show in a third experiment that our automatically generated affect responses cause participants to show signs of increased engagement and enjoyment compared with arbitrarily chosen comparable motion parameters.
IEEE Robotics & Automation Magazine | 2013
Marcelo Cicconet; Mason Bretan; Gil Weinberg
Anticipation based on visual cues is a fundamental aspect of human-human interaction, and it plays an important role in the time-demanding medium of group music. In this article, we explore the importance of visual gesture anticipation in music performance involving humans and robots. We study the particular case in which a human percussionist is playing a four-piece percussion set, and a robot musician is playing either the marimba or a three-piece percussion set. We use computer vision to embed anticipation in the robotic response to the human gestures.
Human-Robot Interaction | 2012
Marcelo Cicconet; Mason Bretan; Gil Weinberg
Anticipation based on visual cues is a fundamental aspect of human-human interaction, and it plays an especially important role in the time-demanding medium of group performance. In this work we explore the importance of visual gesture anticipation in music performance involving humans and robots. We study the case in which a human percussionist is playing a four-piece percussion set, and a robot musician is playing either the marimba or a three-piece percussion set. Computer vision is used to embed anticipation in the robotic response to the human gestures. We developed two algorithms for anticipation, predicting the strike location about 10 milliseconds or about 100 milliseconds before it occurs. Using the second algorithm, we show that the robot outperforms, on average, a group of human subjects in synchronizing its gesture with a reference strike. We also show that, in the tested group of users, having some advance notice helps a human synchronize the strike with a reference player, but beyond a certain lead time that benefit stops increasing.
Communications of The ACM | 2016
Mason Bretan; Gil Weinberg
arXiv: Sound | 2016
Mason Bretan; Gil Weinberg; Larry P. Heck
New Interfaces for Musical Expression | 2014
Mason Bretan; Gil Weinberg
arXiv: Robotics | 2016
Mason Bretan; Deepak Gopinath; Philip Mullins; Gil Weinberg
International Computer Music Conference | 2012
Mason Bretan; Marcelo Cicconet; Ryan Nikolaidis; Gil Weinberg
National Conference on Artificial Intelligence | 2017
Mason Bretan; Sageev Oore; Jesse Engel; Douglas Eck; Larry P. Heck
National Conference on Artificial Intelligence | 2017
Mason Bretan; Gil Weinberg