Publication


Featured research published by Zack Settel.


Contemporary Music Review | 1994

Real-time timbral transformation: FFT-based resynthesis

Zack Settel; Cort Lippe

This paper presents real-time musical applications using the IRCAM Signal Processing Workstation which make use of FFT/IFFT-based resynthesis for timbral transformation in a compositional context. An intuitive and straightforward user interface, intended for use by musicians, has been developed by the authors in the Max programming environment. Techniques for filtering, cross-synthesis, noise reduction, and dynamic spectral shaping are presented along with control structures that allow for both fine timbral modification and control of complex sound transformations using few parameters.
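The FFT/IFFT resynthesis chain the abstract outlines can be illustrated with a minimal sketch: transform one windowed frame, shape its magnitude spectrum with a per-bin envelope, and resynthesize by inverse FFT. This is an illustrative NumPy reconstruction under stated assumptions, not the authors' Max/ISPW implementation; the function name and envelope are hypothetical.

```python
import numpy as np

def spectral_shape(frame, envelope):
    """Shape one windowed frame's magnitude spectrum, then resynthesize.

    frame    -- real-valued audio samples for one analysis window
    envelope -- per-bin gains applied to the magnitude spectrum
    """
    spectrum = np.fft.rfft(frame)              # forward FFT
    magnitude = np.abs(spectrum) * envelope    # dynamic spectral shaping
    phase = np.angle(spectrum)                 # keep the original phase
    shaped = magnitude * np.exp(1j * phase)
    return np.fft.irfft(shaped, n=len(frame))  # IFFT resynthesis

# Example: low-pass-like shaping of a 1 kHz sine in a 512-sample frame
sr = 44100
t = np.arange(512) / sr
frame = np.sin(2 * np.pi * 1000 * t)
env = np.linspace(1.0, 0.0, 257)  # roll gains off toward high bins
out = spectral_shape(frame, env)
```

In practice the frame would be windowed and the process run with overlap-add; the single-frame version above only shows the per-bin shaping step.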


Workshop on Applications of Signal Processing to Audio and Acoustics | 1995

Real-time musical applications using frequency domain signal processing

Zack Settel; Cort Lippe

This paper presents real-time musical applications using the IRCAM signal processing workstation which make use of FFT/IFFT-based resynthesis for timbral transformation in a musical context. An intuitive and straightforward user interface, intended for use by musicians, has been developed by the authors in the Max programming environment. Techniques for high quality time-stretching, filtering, cross-synthesis, dynamic range processing, and spectrum shaping are presented along with dynamic control structures that allow for both fine timbral modification and control of complex sound transformations using few parameters.


Workshop on Applications of Signal Processing to Audio and Acoustics | 1999

Low-dimensional audio-rate control of FFT-based processing

Cort Lippe; Zack Settel

While the use of the fast Fourier transform (FFT) for signal processing in music applications has been widespread, applications in real-time systems for dynamic spectral transformation have been quite limited. The limitations have been largely due to the amount of computation required for the operations. With faster machines, and with suitable implementation for frequency-domain processing, real-time dynamic control of high-quality spectral processing can be accomplished with great efficiency and a simple approach. This paper describes some previous work in dynamic real-time control of frequency-domain-based signal processing. Since the implementation of the FFT/IFFT is central to the approach and methods discussed, the authors provide a description of this implementation, as well as of the development environment used in their work.
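One low-dimensional, audio-rate control in the spirit of this work is cross-synthesis: a single control signal's per-bin magnitudes steer all of a carrier's FFT bins at once, so one input shapes hundreds of parameters. The sketch below is a hedged NumPy illustration of that idea, not the paper's own implementation; the function name and normalization are assumptions.

```python
import numpy as np

def cross_synthesis(carrier, modulator):
    """Impose the modulator's spectral envelope on the carrier.

    Both inputs are one analysis frame of equal length. The modulator's
    per-bin magnitudes act as an audio-rate control over the carrier's
    bins, so a single signal drives the whole spectral transformation.
    """
    c = np.fft.rfft(carrier)
    m = np.fft.rfft(modulator)
    gains = np.abs(m) / (np.abs(m).max() + 1e-12)  # normalized per-bin control
    return np.fft.irfft(c * gains, n=len(carrier))

frame = 1024
rng = np.random.default_rng(0)
noise = rng.standard_normal(frame)                       # broadband carrier
tone = np.sin(2 * np.pi * 220 * np.arange(frame) / 44100)  # tonal modulator
out = cross_synthesis(noise, tone)  # noise filtered through the tone's spectrum
```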


New Interfaces for Musical Expression | 2007

Ménagerie imaginaire

Zack Settel; Mike Wozniewski; Jean-Michel Dumas

Our work leading to this piece began with the search for a better way to interact with electronic sound during performance. Taking the current trend towards performing on stage with laptop computers, we have conceived of a radically different arrangement in which performer motion is not only unrestricted, but actually serves a principal role in the performance interface. We abandon conventional electronic audio devices and use a physically modelled virtual simulation as the performance medium. This allows natural gestures to control sophisticated audio processing, while still allowing traditional musical instruments to be played. In our approach, motions and sounds are captured and modelled within a virtual 3D environment. A combination of microphones and sensors allows multiple performers to input their audio into the scene and steer their signal with great precision, allowing for interaction with digital audio effects located throughout the virtual world. For example, when a performer wishes to send sound through a reverberator, they simply point their instrument toward the reverb unit in the 3D space. The performers may also travel through different sonic regions that contain varying types of musical accompaniment and effects. The scene is rendered graphically in real time, allowing the audience to watch the performance on a large screen situated above the performers on stage. The audience viewpoint provides a subjective rendering, with proper spatialization of all virtual audio sources. Live video from webcams provides textures that are mapped onto 3D avatars, allowing the audience to see close-up views of the performers.
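The pointing gesture described in the abstract (aiming an instrument at a virtual reverb unit to route sound into it) amounts to deriving a send level from the angle between the instrument's orientation and the direction to the effect. A minimal sketch of one such mapping, assuming a simple cosine rolloff; the function, parameters, and exponent are illustrative assumptions, not the piece's actual implementation.

```python
import numpy as np

def send_level(position, orientation, effect_position, exponent=4.0):
    """Gain of a performer's send to a virtual effect, from pointing direction.

    The more directly the instrument's orientation vector points at the
    effect's location in the 3D scene, the higher the send level.
    """
    to_effect = np.asarray(effect_position, float) - np.asarray(position, float)
    to_effect /= np.linalg.norm(to_effect)
    aim = np.asarray(orientation, float)
    aim /= np.linalg.norm(aim)
    alignment = max(0.0, float(np.dot(aim, to_effect)))  # 1 = aimed directly
    return alignment ** exponent  # raise exponent to sharpen directivity

print(send_level([0, 0, 0], [1, 0, 0], [5, 0, 0]))  # aimed at the effect -> 1.0
print(send_level([0, 0, 0], [0, 1, 0], [5, 0, 0]))  # aimed sideways -> 0.0
```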


International Computer Music Conference | 1987

Control of Phrasing and Articulation in Synthesis

David Wessel; David Bristow; Zack Settel


Journal of the Audio Engineering Society | 2000

Real-Time Streaming of Multichannel Audio Data over Internet

Aoxiang Xu; Wieslaw Woszczyk; Zack Settel; Bruce W. Pennycook; Robert Rowe; Philip Galanter; Jeffrey Bary; Geoff Martin; Jason Corey; Jeremy R. Cooperstock


International Computer Music Conference | 1993

Nonobvious roles for electronics in performance enhancement.

Miller Puckette; Zack Settel


New Interfaces for Musical Expression | 2008

Large-Scale Mobile Audio Environments for Collaborative Musical Interaction

Michael Wozniewski; Nicolas Bouillot; Zack Settel; Jeremy R. Cooperstock


New Interfaces for Musical Expression | 2006

A framework for immersive spatial audio performance

Michael Wozniewski; Zack Settel; Jeremy R. Cooperstock


New Interfaces for Musical Expression | 2012

A Survey and Thematic Analysis Approach as Input to the Design of Mobile Music GUIs.

Atau Tanaka; Adam Parkinson; Zack Settel; Koray Tahiroglu
