
Publication


Featured research published by Xavier Amatriain.


IEEE MultiMedia | 2009

The AlloSphere: Immersive Multimedia for Scientific Discovery and Artistic Exploration

Xavier Amatriain; JoAnn Kuchera-Morin; Tobias Höllerer; Stephen Travis Pope

The AlloSphere is a spherical space in which immersive, virtual environments allow users to explore large-scale data sets through multimodal, interactive media.


International Conference on Computer Graphics and Interactive Techniques | 2007

The allosphere: a large-scale immersive surround-view instrument

Tobias Höllerer; JoAnn Kuchera-Morin; Xavier Amatriain

We present the design of the Allosphere and initial experiences from its ongoing implementation. The UCSB Allosphere is a novel large-scale instrument for immersive visualization and simulation, which in its full realization will be one of the world's largest immersive environments. The three-story-high cubical space comprises an anechoic chamber with a spherical display screen, ten meters in diameter, surrounding from one to thirty users standing on a bridge structure. The Allosphere is differentiated from conventional virtual reality environments by its size, its focus on collaborative experiences, its seamless surround-view capabilities, and its emphasis on multiple sensory modalities and interaction. The Allosphere is being equipped with high-resolution active stereo projectors, a complete 3D sound system with hundreds of speakers, and interaction technology. In this paper we give an overview of the purpose of the instrument as well as the systems being put into place to equip it. We also review the first results and experiences in developing and using the Allosphere in several prototype projects.


IEEE Software | 2007

CLAM: A Framework for Audio and Music Application Development

Xavier Amatriain

The CLAM (C++ Library for Audio and Music) development framework offers a complete R&D platform for the audio and music domain. Winner of the 2006 ACM Open Source Multimedia Software Competition, CLAM originated in an effort to organize a repository of audio-processing algorithms. Today it includes an abstract model for audio systems, a repository of processing algorithms and data types, and several tools and stand-alone applications. Developers can exploit all these features to build cross-platform applications or rapid prototypes for testing signal- and media-processing algorithms and systems.


ACM Multimedia | 2006

CLAM: a framework for efficient and rapid development of cross-platform audio applications

Xavier Amatriain; Pau Arumí; David Garcia

CLAM is a C++ framework that offers a complete development and research platform for the audio and music domain. Apart from offering an abstract model for audio systems, it also includes a repository of processing algorithms and data types, as well as a number of tools such as audio or MIDI input/output. All these features can be exploited to build cross-platform applications or rapid prototypes to test signal and media processing algorithms and systems. The framework also includes a number of stand-alone applications that can be used for tasks such as audio analysis/synthesis, plug-in development, or metadata annotation. In this article we give a brief overview of CLAM's features and applications.


Multimedia Systems | 2008

A framework for efficient and rapid development of cross-platform audio applications

Xavier Amatriain; Pau Arumí; David Garcia

In this article, we present CLAM, a C++ software framework that offers a complete development and research platform for the audio and music domain. It offers an abstract model for audio systems and includes a repository of processing algorithms and data types, as well as all the necessary tools for audio and control input/output. The framework offers tools that exploit all these features to easily build cross-platform applications or rapid prototypes for media processing algorithms and systems. Furthermore, the included ready-to-use applications can be used for tasks such as audio analysis/synthesis, plug-in development, feature extraction, or metadata annotation. CLAM represents a step forward over similar existing environments in the multimedia domain. Nevertheless, it also shares models and constructs with many of those; these commonalities are expressed in the form of a metamodel for multimedia processing systems and a design pattern language.


IEEE Transactions on Multimedia | 2007

A Domain-Specific Metamodel for Multimedia Processing Systems

Xavier Amatriain

In this paper, we introduce 4MPS, a metamodel for multimedia processing systems. The goal of 4MPS is to offer a generic system metamodel that can be instantiated to describe any multimedia processing design. The metamodel combines the advantages of the object-oriented paradigm and metamodeling techniques with system engineering principles and graphical models of computation. 4MPS is based on the classification of multimedia processing objects into two main categories: Processing objects, which operate on data and controls, and Data objects, which passively hold media content. Processing objects encapsulate a method or algorithm. They also include support for synchronous data processing and asynchronous event-driven Controls, as well as a configuration mechanism and an explicit life-cycle state model. Data input to and output from Processing objects is done through Ports. Data objects offer a homogeneous interface to media data, and support metaobject-like facilities such as reflection and serialization. The metamodel can be expressed in the language of graphical models of computation such as Dataflow Networks, and presents a comprehensive conceptual framework for media signal processing applications. 4MPS has its practical validation in several existing environments, including the authors' CLAM framework.


Computer Music Modeling and Retrieval | 2008

Experiencing Audio and Music in a Fully Immersive Environment

Xavier Amatriain; Jorge Castellanos; Tobias Höllerer; JoAnn Kuchera-Morin; Stephen Travis Pope; Graham Wakefield; Will Wolcott

The UCSB Allosphere is a 3-story-high spherical instrument in which virtual environments and performances can be experienced in full immersion. The space is now being equipped with high-resolution active stereo projectors, a 3D sound system with several hundred speakers, and with tracking and interaction mechanisms. The Allosphere is at the same time multimodal, multimedia, multi-user, immersive, and interactive. This novel and unique instrument will be used for research into scientific visualization/auralization and data exploration, and as a research environment for behavioral and cognitive scientists. It will also serve as a research and performance space for artists exploring new forms of art. In particular, the Allosphere has been carefully designed to allow for immersive music and aural applications. In this paper, we give an overview of the instrument, focusing on the audio subsystem. We give the rationale behind some of the design decisions and explain the different techniques employed in making the Allosphere a truly general-purpose immersive audiovisual lab and stage. Finally, we present first results and our experiences in developing and using the Allosphere in several prototype projects.


International Symposium on Music Information Retrieval | 2005

The CLAM Annotator: A Cross-platform Audio Descriptors Editing Tool

Xavier Amatriain; Jordi Massaguer; David Garcia; Ismael Mosquera


International Computer Music Conference | 2005

Developing Cross-Platform Audio and Music Applications with the CLAM Framework

Xavier Amatriain; Pau Arumí


International Computer Music Conference | 2007

Immersive Audio and Music in the Allosphere

Xavier Amatriain; Tobias Höllerer; JoAnn Kuchera-Morin; Stephen Travis Pope

Collaboration


Dive into Xavier Amatriain's collaborations.

Top Co-Authors

David Garcia

Pompeu Fabra University


Pau Arumí

Pompeu Fabra University


Ryan Avery

University of California


Will Wolcott

University of California
