
Publication


Featured research published by Victor Zappi.


IEEE VR Workshop: Sonic Interaction in Virtual Environments (SIVE) | 2014

Scenography of immersive virtual musical instruments

Florent Berthaut; Victor Zappi; Dario Mazzanti

Immersive Virtual Musical Instruments (IVMIs) can be considered the meeting point of Music Technology and Virtual Reality. Being both musical instruments and elements of Virtual Environments, IVMIs require a transversal approach from their designers, in particular when the final aim is to play them in front of an audience, as part of a scenography. In this paper, we combine the main constraints of musical performances and Virtual Reality applications into a set of dimensions meant to extensively describe IVMI stage setups. We then classify a number of existing stage setups using these dimensions, explaining how they were used to showcase live virtual performances and discussing their scenographic level.


International Conference on Haptic and Audio Interaction Design | 2010

Virtual sequencing with a tactile feedback device

Victor Zappi; Marco Gaudina; Andrea Brogni; Darwin G. Caldwell

Since the beginning of Virtual Reality, many artistic applications have been developed, showing how this technology can be exploited not only from a technical point of view, but also in the field of feelings and emotions. Nowadays music is one of the most interesting fields of application for Virtual Reality, and many environments provide the user with means of self-expression; our work follows this direction, aiming at developing a set of multimodal musical interfaces. In this paper we present a first simple virtual sequencer combined with a low-cost tactile feedback device: preliminary experiments were carried out to analyze how skilled musicians approach this unusual way of making music.


IEEE International Workshop on Haptic Audio Visual Environments and Games | 2010

Distributed multimodal interaction driven framework: Conceptual model and game example

Marco Gaudina; Victor Zappi; Andrea Brogni; Darwin G. Caldwell

Since technology became distributed on a large scale, the gaming experience has changed radically. After years of competing over graphical detail, attention has shifted to how users can interact with games. Users nowadays can play and interact much more than before, taking on the role of the chief character of the entire game scene through their physical actions. The possibilities have been increased by modern game controllers and multimodal devices. Our work follows this direction, aiming to develop a unifying interaction framework to distribute user interactions. This could enhance user possibilities, feelings, and emotions across different devices with heterogeneous capabilities. Hence, we present a conceptual model where the type of interaction between users is subordinated not only to the available technology but primarily to their main purpose for that specific interaction. The architecture is applied to a preliminary cross-platform, multiplayer musical game.


Journal of the Acoustical Society of America | 2016

Towards real-time two-dimensional wave propagation for articulatory speech synthesis

Victor Zappi; Arvind Vasudevan; Sidney S. Fels

The precise simulation of voice production is a challenging task, especially when real-time performances are sought. To fulfill real-time constraints, most articulatory vocal synthesizers have to rely on highly simplified acoustic and anatomical models, based on 1D wave propagation and on the usage of vocal tract area functions. In this work, we present a 2D propagation model, designed to simulate the air flow traveling through the midsagittal contour of the vocal tract. Building on the work by Allen et al. [Andrew Allen and Nikunj Raghuvanshi, “Aerophones in flatland: Interactive wave simulation of wind instruments,” ACM Trans. Graph. 34, Article 134 (2015)], we leverage OpenGL and GPU parallelism for a real-time precise 2D airwave simulation. The domain is divided into cells according to a Finite-Difference Time-Domain scheme and coupled with a self-oscillating two-mass vocal fold model. To investigate the system’s ability to simulate the physiology of the vocal tract and its aerodynamics, two studies a...
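
The core numerical technique named in this abstract, a 2D finite-difference time-domain (FDTD) scheme for acoustic wave propagation, can be sketched roughly as follows. This is a minimal CPU illustration in NumPy under assumed grid size, excitation, and boundary handling; it is not the authors' GPU/OpenGL implementation, and the coupling with the self-oscillating two-mass vocal fold model is only stood in for by a toy sinusoidal source.

```python
import numpy as np

# Minimal 2D FDTD sketch of the scalar acoustic wave equation.
# Illustrative only: the paper runs a GPU-parallel 2D scheme coupled with a
# two-mass vocal fold model, which is not reproduced here.

c = 343.0                          # speed of sound in air (m/s)
dx = 1e-3                          # spatial step (m), assumed
dt = dx / (c * np.sqrt(2.0))       # CFL-stable time step for a 2D grid
nx, ny = 256, 128                  # grid size, assumed
steps = 2000

p_prev = np.zeros((nx, ny))        # pressure at time n-1
p      = np.zeros((nx, ny))        # pressure at time n

def laplacian(f):
    # 5-point stencil; the outer ring stays at zero (boundary handling omitted)
    lap = np.zeros_like(f)
    lap[1:-1, 1:-1] = (f[2:, 1:-1] + f[:-2, 1:-1] +
                       f[1:-1, 2:] + f[1:-1, :-2] - 4.0 * f[1:-1, 1:-1])
    return lap

coeff = (c * dt / dx) ** 2
src_x, src_y = 10, ny // 2         # "glottal" source location, assumed

for n in range(steps):
    # leapfrog update: p(n+1) = 2 p(n) - p(n-1) + (c dt / dx)^2 * laplacian(p(n))
    p_next = 2.0 * p - p_prev + coeff * laplacian(p)
    # toy excitation standing in for the self-oscillating vocal fold model
    p_next[src_x, src_y] += np.sin(2.0 * np.pi * 120.0 * n * dt)
    p_prev, p = p, p_next
```

In the real-time setting described in the paper, an update of this kind is evaluated in parallel across all cells on the GPU; the per-cell stencil above is exactly the work that parallelizes.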


EURASIP Journal on Audio, Speech, and Music Processing | 2012

Music expression with a robot manipulator used as a bidirectional tangible interface

Victor Zappi; Antonio Pistillo; Sylvain Calinon; Andrea Brogni; Darwin G. Caldwell

The availability of haptic interfaces in music content processing offers interesting possibilities of performer-instrument interaction for musical expression. These new musical instruments can precisely modulate the haptic feedback and map it to a sonic output, thus offering new artistic content creation possibilities. With this article, we investigate the use of a robotic arm as a bidirectional tangible interface for musical expression, actively modifying the compliance control strategy to create a binding between gestural input and music output. The user can define recursive modulations of music parameters by grasping and gradually refining periodic movements on a gravity-compensated robot manipulator. The robot learns the new desired trajectory online, increasing its stiffness as the modulation refinement proceeds. This article reports early results of an artistic performance that was carried out in collaboration with a musician, who played with the robot as part of his live stage setup.
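
A rough sketch of the interaction loop described here, a periodic reference trajectory refined from the user's guidance while the manipulator's stiffness grows as the modulation stabilizes, might look like the following. This is a simplified, hypothetical formulation (exponential averaging over a fixed period, a scalar stiffness ramp driven by a consistency score), not the learning and compliance-control method actually used in the paper.

```python
import numpy as np

# Hypothetical sketch: refine a periodic reference trajectory from user-guided
# motion and increase impedance (stiffness) as the demonstration stabilizes.
# Not the paper's actual learning/control scheme.

period_samples = 200                 # samples per modulation period, assumed
ref = np.zeros(period_samples)       # learned periodic reference (1 DOF for simplicity)
k_min, k_max = 5.0, 400.0            # stiffness range (N/m), assumed
alpha = 0.2                          # learning rate for the reference update

def update_reference(phase_idx, q_user, confidence):
    """Blend the user's demonstrated position into the periodic reference and
    return a stiffness that grows with how consistent the demonstration is."""
    ref[phase_idx] = (1.0 - alpha) * ref[phase_idx] + alpha * q_user
    k = k_min + (k_max - k_min) * np.clip(confidence, 0.0, 1.0)
    return ref[phase_idx], k

def control_force(q, q_dot, q_ref, k, damping_ratio=1.0, mass=1.0):
    # Simple impedance-style law: spring toward the learned reference plus damping.
    d = 2.0 * damping_ratio * np.sqrt(k * mass)
    return k * (q_ref - q) - d * q_dot

# 'confidence' could, for instance, be 1 minus the normalized deviation between
# consecutive demonstrated periods; that choice is an assumption of this sketch.
```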


Symposium on 3D User Interfaces | 2010

Passive hand pose recognition in virtual reality

Victor Zappi; Andrea Brogni; Darwin G. Caldwell

In this poster we present a hand pose recognition interface based on passive tracking: currently four poses can be detected on both hands, using a small marker set attached to the fingers and to the back of the hands. This user interface is designed to support a natural way of interacting with objects in Virtual Reality, letting users quickly switch among selection, translation, and rotation commands. The interface described here is part of a multimodal platform where sound parameters can be changed without the use of musical instruments or controllers, but through free-hand mesh manipulation.
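
As a purely illustrative example of how a small passive marker set can drive pose classification, one could compare normalized fingertip-to-hand-back marker distances against stored pose templates. The marker layout, feature vectors, and nearest-template rule below are assumptions for the sketch, not the recognition method used in the poster.

```python
import numpy as np

# Hypothetical pose classifier: match normalized fingertip-to-hand-back
# distances against stored templates (nearest template wins).

POSE_TEMPLATES = {                    # assumed feature vectors, one per pose
    "open":  np.array([1.0, 1.0, 1.0, 1.0]),
    "fist":  np.array([0.4, 0.4, 0.4, 0.4]),
    "point": np.array([1.0, 0.4, 0.4, 0.4]),
    "pinch": np.array([0.5, 0.5, 1.0, 1.0]),
}

def classify(finger_markers, back_marker, hand_size):
    """finger_markers: (4, 3) fingertip positions; back_marker: (3,) hand-back
    position; hand_size: scalar used to normalize distances."""
    feats = np.linalg.norm(finger_markers - back_marker, axis=1) / hand_size
    names = list(POSE_TEMPLATES)
    dists = [np.linalg.norm(feats - POSE_TEMPLATES[n]) for n in names]
    return names[int(np.argmin(dists))]
```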


Frontiers in ICT | 2018

Hackable Instruments: Supporting Appropriation and Modification in Digital Musical Interaction

Victor Zappi; Andrew McPherson

This paper investigates the appropriation of digital musical instruments, wherein the performer develops a personal working relationship with an instrument that may differ from the designer's intent. Two studies are presented which explore different facets of appropriation. First, a highly restrictive instrument was designed to assess the effects of constraint on unexpected creative use. Second, a digital instrument was created which initially shared several constraints and interaction modalities with the first instrument, but which could be rewired by the performer to discover sounds not directly anticipated by the designers. Each instrument was studied with 10 musicians working individually to prepare public performances on the instrument. The results suggest that constrained musical interactions can promote the discovery of unusual and idiosyncratic playing techniques, and that tighter constraints may paradoxically lead to a richer performer experience. The diversity of ways in which the rewirable instrument was modified and used indicates that its design is open to interpretation by the performer, who may discover interaction modalities that were not anticipated by the designers.


Computer Music Journal | 2018

Extended Playing Techniques on an Augmented Virtual Percussion Instrument

Victor Zappi; Andrew Allen; Sidney S. Fels

Innovation and tradition are two fundamental factors in the design of new digital musical instruments. Although apparently mutually exclusive, novelty does not imply a total disconnection from what we have inherited from hundreds of years of traditional design, and the balance of these two factors often determines the overall quality of an instrument. Inspired by this rationale, in this article we introduce the Hyper Drumhead, a novel augmented virtual instrument whose design is deeply rooted in traditional musical paradigms, yet aimed at the exploration of unprecedented sounds and control. In the first part of the article we analyze the concepts of designing an augmented virtual instrument, explaining their connection with the practice of augmenting traditional instruments. Then we describe the design of the Hyper Drumhead in detail, focusing on its innovative physical modeling implementation. The finite-difference time-domain solver that we use runs on the parallel cores of a commercially available graphics card and permits the simulation of real-time 2-D wave propagation in massively sized domains. Thanks to the modularity of this implementation, musicians can create several 2-D virtual percussive instruments that support realistic playing techniques but whose affordances can be enhanced beyond most of the limits of traditional augmentation.
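
To complement the FDTD sketch given after the 2016 JASA abstract above, the following hypothetical fragment illustrates one common way of turning a playing gesture into an excitation for such a 2D simulation: injecting a raised-cosine pressure bump at the strike position, scaled by strike velocity. This is a generic technique, not necessarily how the Hyper Drumhead maps its input.

```python
import numpy as np

# Hypothetical excitation for a 2D FDTD membrane: a raised-cosine "strike"
# centered at the hit position, scaled by velocity. Generic illustration,
# not the Hyper Drumhead's actual input mapping.

def strike(p, cx, cy, radius, velocity):
    """Add a smooth pressure bump to grid p, centered at (cx, cy)."""
    nx, ny = p.shape
    x = np.arange(nx)[:, None]
    y = np.arange(ny)[None, :]
    r = np.sqrt((x - cx) ** 2 + (y - cy) ** 2)
    bump = 0.5 * (1.0 + np.cos(np.pi * np.clip(r / radius, 0.0, 1.0)))
    p += velocity * bump * (r <= radius)
    return p
```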


Virtual Systems and Multimedia | 2012

Point clouds indexing in real time motion capture

Dario Mazzanti; Victor Zappi; Andrea Brogni; Darwin G. Caldwell

Today's human-computer interaction techniques are often gesture-inspired and thus pushed towards naturalness and immediateness. Their implementation requires non-invasive tracking systems that work with few or no body-attached devices, such as wireless optical motion capture. These technologies present a recurrent problem: keeping a coherent indexing for the different captured points during real-time tracking. The inability to consistently distinguish tracked points limits interaction naturalness and design possibilities. In this paper we present a real-time algorithm capable of dealing with the point indexing problem. Compared to other solutions, the presented research adds a computed indexing correction that keeps the indexing coherent throughout the tracking session. The correction is applied automatically by the system whenever a specific configuration is detected. Our solution works with an arbitrary number of points and was primarily designed for fingertip tracking. A Virtual Reality application was developed in order to exercise the algorithm's functionality while testing its behavior and effectiveness. The application provides a virtual stereoscopic, user-centric environment in which the user can trigger simple interactions by reaching virtual objects with his/her fingertips.
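
A minimal version of the general problem described here, keeping point indices coherent between consecutive frames, can be expressed as an optimal assignment between the previous and current point sets. The SciPy-based sketch below is a generic nearest-assignment baseline that assumes the same points are visible in both frames; it is not the correction algorithm proposed in the paper, whose contribution is an automatic indexing correction triggered when a specific configuration is detected.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Generic baseline for frame-to-frame point indexing: assign current points to
# previous indices by minimizing total displacement. Assumes equal point counts
# (no occlusions); the paper's automatic correction step is not reproduced here.

def reindex(prev_pts, curr_pts):
    """prev_pts, curr_pts: (N, 3) arrays of tracked 3D points.
    Returns curr_pts reordered so that row i corresponds to prev_pts[i]."""
    cost = np.linalg.norm(prev_pts[:, None, :] - curr_pts[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)
    order = np.empty(len(curr_pts), dtype=int)
    order[rows] = cols
    return curr_pts[order]
```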


Journal of the Audio Engineering Society | 2015

An Environment for Submillisecond-Latency Audio and Sensor Processing on BeagleBone Black

Andrew McPherson; Victor Zappi

Collaboration


Dive into Victor Zappi's collaborations.

Top Co-Authors

Andrea Brogni, Istituto Italiano di Tecnologia
Darwin G. Caldwell, Istituto Italiano di Tecnologia
Dario Mazzanti, Istituto Italiano di Tecnologia
Sidney S. Fels, University of British Columbia
Andrew McPherson, Queen Mary University of London
Marco Gaudina, Istituto Italiano di Tecnologia
Antonio Pistillo, Istituto Italiano di Tecnologia
Peter Anderson, University of British Columbia