Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Graham Wakefield is active.

Publication


Featured research published by Graham Wakefield.


International Journal of Human-Computer Studies / International Journal of Man-Machine Studies | 2009

The Allobrain: An interactive, stereographic, 3D audio, immersive virtual world

John N. Thompson; JoAnn Kuchera-Morin; Marcos Novak; Daniel Overholt; Lance Jonathan Putnam; Graham Wakefield; Wesley Smith

This paper describes the creation of the Allobrain project, an interactive, stereographic, 3D audio, immersive virtual world constructed from fMRI brain data and installed in the Allosphere, one of the largest virtual reality spaces in existence. This paper portrays the role the Allobrain project played as an artwork driving the technological infrastructure of the Allosphere. The construction of the Cosm toolkit software for prototyping the Allobrain and other interactive, stereographic, 3D audio, immersive virtual worlds in the Allosphere is described in detail. Aesthetic considerations of the Allobrain project are discussed in relation to world-making as a means to understand and explore large data sets.


EvoWorkshops on Applications of Evolutionary Computing | 2009

Artificial Nature: Immersive World Making

Graham Wakefield; Haru (Hyunkyung) Ji

Artificial Nature is a trans-disciplinary research project drawing upon bio-inspired system theories in the production of engaging immersive worlds as art installations. Embodied world making and immersion are identified as key components in an exploration of creative ecosystems toward art-as-it-could-be. A detailed account of the design of a successfully exhibited creative ecosystem is given in these terms, and open questions are outlined.


Computer Music Modeling and Retrieval | 2008

Experiencing Audio and Music in a Fully Immersive Environment

Xavier Amatriain; Jorge Castellanos; Tobias Höllerer; JoAnn Kuchera-Morin; Stephen Travis Pope; Graham Wakefield; Will Wolcott

The UCSB Allosphere is a 3-story-high spherical instrument in which virtual environments and performances can be experienced in full immersion. The space is now being equipped with high-resolution active stereo projectors, a 3D sound system with several hundred speakers, and with tracking and interaction mechanisms. The Allosphere is at the same time multimodal, multimedia, multi-user, immersive, and interactive. This novel and unique instrument will be used for research into scientific visualization/auralization and data exploration, and as a research environment for behavioral and cognitive scientists. It will also serve as a research and performance space for artists exploring new forms of art. In particular, the Allosphere has been carefully designed to allow for immersive music and aural applications. In this paper, we give an overview of the instrument, focusing on the audio subsystem. We give the rationale behind some of the design decisions and explain the different techniques employed in making the Allosphere a truly general-purpose immersive audiovisual lab and stage. Finally, we present first results and our experiences in developing and using the Allosphere in several prototype projects.
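The paper does not reproduce its spatialization techniques here, but first-order Ambisonics is one standard way to feed a source signal to a large spherical loudspeaker array like the Allosphere's. Below is a minimal sketch of the textbook B-format encode and a simple single-speaker projection decode; it is illustrative only, not code from the Allosphere system.

```typescript
// First-order Ambisonics (B-format) encoding of a mono sample.
// theta: azimuth (radians), phi: elevation (radians).
// Standard textbook equations; not code from the Allosphere itself.
interface BFormat {
  w: number; // omnidirectional component
  x: number; // front-back
  y: number; // left-right
  z: number; // up-down
}

function encodeFOA(sample: number, theta: number, phi: number): BFormat {
  return {
    w: sample * Math.SQRT1_2, // 1/sqrt(2) keeps W at comparable energy
    x: sample * Math.cos(theta) * Math.cos(phi),
    y: sample * Math.sin(theta) * Math.cos(phi),
    z: sample * Math.sin(phi),
  };
}

function decodeToSpeaker(b: BFormat, thetaS: number, phiS: number): number {
  // Simple "projection" decode for one speaker at (thetaS, phiS); real
  // decoders weight and normalize gains across the whole array.
  return 0.5 * (
    b.w * Math.SQRT2 +
    b.x * Math.cos(thetaS) * Math.cos(phiS) +
    b.y * Math.sin(thetaS) * Math.cos(phiS) +
    b.z * Math.sin(phiS)
  );
}
```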


Archive | 2008

Computational Audiovisual Composition Using Lua

Wesley Smith; Graham Wakefield

We describe extensions to the Lua programming language that constitute a novel platform for practice and investigation in computational audiovisual composition. Significantly, these extensions enable the tight real-time integration of computation, time, sound, and space, following a development approach grounded in the immanent properties of the domain.
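The "tight real-time integration of computation and time" suggests coroutine-style scheduling, where script code yields control for precise durations of audio time. Here is a toy sketch of that idea using TypeScript generators in place of Lua coroutines; the scheduler API below is hypothetical, not the paper's.

```typescript
// A toy sample-accurate scheduler: generator functions yield the number of
// samples to sleep, interleaving user computation with audio time.
// Hypothetical illustration of coroutine-style timing, not the paper's API.
type Task = Generator<number, void, void>;

class Scheduler {
  private queue: { wakeAt: number; task: Task }[] = [];
  private now = 0; // current time in samples

  spawn(task: Task): void {
    this.queue.push({ wakeAt: this.now, task });
  }

  // Advance time by `samples`, resuming any task whose wake time arrives.
  advance(samples: number): void {
    const end = this.now + samples;
    while (true) {
      this.queue.sort((a, b) => a.wakeAt - b.wakeAt);
      const next = this.queue[0];
      if (!next || next.wakeAt > end) break;
      this.now = next.wakeAt;
      const r = next.task.next();
      if (r.done) this.queue.shift();
      else next.wakeAt = this.now + r.value; // yielded value = samples to sleep
    }
    this.now = end;
  }
}

// Usage: a metronome ticking every half second at 44.1 kHz.
function* metronome(): Task {
  while (true) {
    console.log("tick");
    yield 22050;
  }
}

const s = new Scheduler();
s.spawn(metronome());
s.advance(44100); // one second of audio time: ticks at samples 0, 22050, 44100
```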


Computer Music Journal | 2015

Designing Musical Instruments for the Browser

Charles Roberts; Graham Wakefield; Matthew Wright; JoAnn Kuchera-Morin

Native Web technologies provide great potential for musical expression. We introduce two JavaScript libraries towards this end: Gibberish.js, providing heavily optimized audio DSP, and Interface.js, a GUI toolkit that works with mouse, touch, and motion events. Together they provide a complete system for defining musical instruments that can be used in both desktop and mobile Web browsers. Interface.js also enables control of remote synthesis applications via a server application that translates the socket protocol used by Web interfaces into both MIDI and OSC messages. We have incorporated these libraries into the creative coding environment Gibber, where we provide mapping abstractions that enable users to create digital musical instruments in as little as a single line of code. They can then be published to a central database, enabling new instruments to be created, distributed, and run entirely in the browser.
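As a library-neutral illustration of the kind of one-screen browser instrument described (Gibberish.js and Interface.js define their own APIs, which are not reproduced here), the sketch below uses only the standard Web Audio API, mapping pointer position to pitch and loudness.

```typescript
// Minimal browser instrument using only the standard Web Audio API.
// Library-neutral sketch; not Gibberish.js or Interface.js code.
const ctx = new AudioContext();
const osc = ctx.createOscillator();
const amp = ctx.createGain();

osc.type = "sawtooth";
amp.gain.value = 0; // start silent
osc.connect(amp).connect(ctx.destination);
osc.start();

// Map pointer x to frequency (110-880 Hz, three octaves), y to loudness.
window.addEventListener("pointermove", (e) => {
  const fx = e.clientX / window.innerWidth;        // 0..1
  const fy = 1 - e.clientY / window.innerHeight;   // 0..1, bottom = quiet
  osc.frequency.setTargetAtTime(110 * Math.pow(2, 3 * fx), ctx.currentTime, 0.01);
  amp.gain.setTargetAtTime(0.3 * fy, ctx.currentTime, 0.01);
});

// Browsers require a user gesture before audio may start.
window.addEventListener("pointerdown", () => ctx.resume());
```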


4th International Conference on Evolutionary and Biologically Inspired Music, Sound, Art and Design | 2015

Toward Certain Sonic Properties of an Audio Feedback System by Evolutionary Control of Second-Order Structures

Seunghun Kim; Juhan Nam; Graham Wakefield

Aiming for high-level intentional control of audio feedback through microphones, loudspeakers, and digital signal processing, we present a system that adapts toward chosen sonic features. Users control the system by selecting and changing feature objectives in real time. The system has a second-order structure in which the internal signal-processing algorithms are developed according to an evolutionary process: genotypes develop into signal-processing algorithms, and fitness is measured by analysis of the incoming audio feedback. A prototype is evaluated experimentally to measure changes in the audio feedback depending on the chosen target conditions. By enhancing the interactivity of audio feedback through intentional control, we expect that feedback systems can be used more effectively in musical interaction, balancing nonlinearity and interactivity.
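A highly simplified sketch of such an evolutionary control loop follows. The genotypes here are mere parameter vectors and the feature measurement is a stand-in, whereas the paper evolves full signal-processing structures; every name below is hypothetical.

```typescript
// Toy evolutionary loop: genotypes are parameter vectors for a feedback
// processor; fitness is closeness of a measured audio feature to a target
// the user changes in real time. Highly simplified, illustrative only.
type Genotype = number[];

const POP = 16, GENES = 4, MUT = 0.1;

const rand = () => Math.random();
const newGenotype = (): Genotype => Array.from({ length: GENES }, rand);

// Stand-in for analysis of the real incoming feedback signal.
function measureFeature(g: Genotype): number {
  return g.reduce((a, b) => a + b, 0) / g.length;
}

function fitness(g: Genotype, target: number): number {
  return -Math.abs(measureFeature(g) - target); // closer to target = fitter
}

function mutate(g: Genotype): Genotype {
  return g.map(v => Math.min(1, Math.max(0, v + (rand() - 0.5) * MUT)));
}

function step(pop: Genotype[], target: number): Genotype[] {
  const ranked = [...pop].sort((a, b) => fitness(b, target) - fitness(a, target));
  const elite = ranked.slice(0, POP / 2);
  return elite.concat(elite.map(mutate)); // survivors + mutated offspring
}

// The population adapts toward whichever target the user currently selects.
let pop = Array.from({ length: POP }, newGenotype);
for (let gen = 0; gen < 100; gen++) pop = step(pop, 0.8);
```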


IEEE Computer Graphics and Applications | 2013

Spatial Interaction in a Multiuser Immersive Instrument

Graham Wakefield; Tobias Höllerer; JoAnn Kuchera-Morin; Charles Roberts; Matthew Wright

The AlloSphere provides multiuser spatial interaction through a curved surround screen and surround sound. Two projects illustrate how researchers employed the AlloSphere to investigate the combined use of personal-device displays and the shared display. Another two projects combined multiuser interaction with multiagent systems. These projects point to directions for future ensemble-style collaborative interaction.


Proceedings of the 4th International Conference on Movement Computing | 2017

Incorporating Kinesthetic Creativity and Gestural Play into Immersive Modeling

Sung-A Jang; Graham Wakefield; Sung-Hee Lee

The 3D modeling methods and approach presented in this paper attempt to bring the richness and spontaneity of human kinesthetic interaction in the physical world to the process of shaping digital form, by exploring playfully creative interaction techniques that augment gestural movement. The principal contribution of our research is a novel dynamics-driven approach for immersive freeform modeling, which extends our physical reach and supports new forms of expression. In this paper we examine three augmentations of freehand 3D interaction that are inspired by the dynamics of physical phenomena. These are experienced via immersive augmented reality to intensify the virtual physicality and heighten the sense of creative empowerment.
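The abstract does not detail the specific dynamics used; one generic example of "dynamics-driven" gestural input is a spring-damper brush that lags behind the tracked hand, lending strokes a physical character. A minimal, purely illustrative sketch:

```typescript
// A virtual brush that follows the tracked hand via spring-damper dynamics,
// one generic way to give freehand gestural input physical character.
// Illustrative assumption; not the dynamics described in the paper.
interface Vec3 { x: number; y: number; z: number; }

class SpringBrush {
  pos: Vec3 = { x: 0, y: 0, z: 0 };
  vel: Vec3 = { x: 0, y: 0, z: 0 };
  // damping ~ 2*sqrt(k) gives a critically damped, non-oscillating follow.
  constructor(private k = 120, private damping = 22) {}

  // Advance one time step (dt in seconds) toward the hand position.
  update(hand: Vec3, dt: number): Vec3 {
    for (const a of ["x", "y", "z"] as const) {
      const accel = this.k * (hand[a] - this.pos[a]) - this.damping * this.vel[a];
      this.vel[a] += accel * dt;
      this.pos[a] += this.vel[a] * dt;
    }
    return this.pos; // deposit geometry along this smoothed, lagging path
  }
}
```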


Archive | 2017

2013: The Web Browser as Synthesizer and Interface

Charles Roberts; Graham Wakefield; Matthew Wright

Our research examines the use and potential of native web technologies for musical expression. We introduce two JavaScript libraries towards this end: Gibberish.js, a heavily optimized audio DSP library, and Interface.js, a GUI toolkit that works with mouse, touch and motion events. Together these libraries provide a complete system for defining musical instruments that can be used in both desktop and mobile web browsers. Interface.js also enables control of remote synthesis applications via a server application that translates the socket protocol used by web interfaces into both MIDI and OSC messages.


Applied Sciences | 2016

Augmenting Environmental Interaction in Audio Feedback Systems

Seunghun Kim; Graham Wakefield; Juhan Nam

Audio feedback is defined as a positive feedback of acoustic signals where an audio input and output form a loop, and may be utilized artistically. This article presents new context-based controls over audio feedback, leading to the generation of desired sonic behaviors by enriching the influence of existing acoustic information such as room response and ambient noise. This ecological approach to audio feedback emphasizes mutual sonic interaction between signal processing and the acoustic environment. Mappings from analyses of the received signal to signal-processing parameters are designed to emphasize this specificity as an aesthetic goal. Our feedback system presents four types of mappings: approximate analyses of room reverberation to tempo-scale characteristics, ambient noise to amplitude and two different approximations of resonances to timbre. These mappings are validated computationally and evaluated experimentally in different acoustic conditions.
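Of the four mappings, ambient noise to amplitude is the simplest to sketch: track the running RMS of the incoming signal and scale the loop gain inversely, so the feedback neither dies out nor runs away. The constants below are illustrative, not taken from the paper.

```typescript
// Sketch of an ambient-level-to-amplitude mapping: a one-pole envelope
// follower estimates RMS, and the feedback-loop gain adapts inversely.
// Constants are illustrative assumptions, not the paper's values.
class AdaptiveFeedbackGain {
  private meanSquare = 0;
  constructor(
    private smoothing = 0.999,  // one-pole follower coefficient
    private targetLevel = 0.1,  // desired loop level
    private maxGain = 4.0,      // safety ceiling
  ) {}

  // Called once per input sample; returns the gain to apply in the loop.
  process(sample: number): number {
    // One-pole smoothing of the squared signal approximates signal power.
    this.meanSquare = this.smoothing * this.meanSquare
                    + (1 - this.smoothing) * sample * sample;
    const level = Math.sqrt(this.meanSquare) + 1e-9; // avoid divide-by-zero
    return Math.min(this.maxGain, this.targetLevel / level);
  }
}
```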

Collaboration


Dive into Graham Wakefield's collaborations.

Top Co-Authors

Charles Roberts

Rochester Institute of Technology

Wesley Smith

University of California
