Publications


Featured research published by David Merrill.


tangible and embedded interaction | 2007

Siftables: towards sensor network user interfaces

David Merrill; Jeevan James Kalanithi; Pattie Maes

This paper outlines Siftables, a novel platform that applies technology and methodology from wireless sensor networks to tangible user interfaces in order to yield new possibilities for human-computer interaction. Siftables are compact devices with sensing, graphical display, and wireless communication. They can be physically manipulated as a group to interact with digital information and media. We discuss the unique affordances that a sensor network user interface (SNUI) such as Siftables provides, as well as the resulting directness between the physical interface and the data being manipulated. We conclude with a description of some gestural language primitives that we are currently prototyping with Siftables.
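As a hedged illustration of the gestural-language primitives mentioned above, the sketch below implements one plausible primitive, pairing two devices that are bumped together, from accelerometer spike timestamps. All names and thresholds (Device, BUMP_WINDOW_S) are hypothetical illustrations, not the actual Siftables firmware API.

```python
# A minimal sketch of one possible gestural primitive: pairing two devices
# when their accelerometers report a near-simultaneous bump. Hypothetical
# names and threshold; not the Siftables implementation.
import time
from dataclasses import dataclass

BUMP_WINDOW_S = 0.15  # two spikes within this window count as one gesture

@dataclass
class Device:
    device_id: int
    last_bump: float = 0.0  # timestamp of most recent accelerometer spike

    def on_accelerometer_spike(self) -> None:
        self.last_bump = time.monotonic()

def detect_bump_pair(a: Device, b: Device) -> bool:
    """True if both devices registered a bump within BUMP_WINDOW_S of each other."""
    return (a.last_bump > 0 and b.last_bump > 0
            and abs(a.last_bump - b.last_bump) < BUMP_WINDOW_S)

# Usage: feed spikes from each device's sensor stream, then poll.
left, right = Device(1), Device(2)
left.on_accelerometer_spike()
right.on_accelerometer_spike()
if detect_bump_pair(left, right):
    print(f"pairing devices {left.device_id} and {right.device_id}")
```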


human factors in computing systems | 2012

Sifteo cubes

David Merrill; Emily Sun; Jeevan James Kalanithi

In this paper we describe Sifteo cubes™, a tangible and graphical user interface platform. We note several patterns of use observed in homes and schools, and identify design recommendations for display utilization on distributed interfaces like Sifteo cubes. Additionally, we discuss the process of commercializing the research prototype to create a marketable game system.


ubiquitous computing | 2010

Identifying and facilitating social interaction with a wearable wireless sensor network

Joseph A. Paradiso; Jonathan Gips; Mathew Laibowitz; Sajid Sadi; David Merrill; Ryan Aylward; Pattie Maes; Alex Pentland

We have designed a highly versatile badge system to facilitate a variety of interactions at large professional or social events and to serve as a platform for conducting research into human dynamics. The badges are equipped with a large LED display, wireless infrared and radio-frequency networking, and a host of sensors to collect data that we have used to develop features and algorithms aimed at classifying and predicting individual and group behavior. This paper gives an overview of our badge system, describes the interactions and capabilities it enabled for wearers, and presents data collected over several large deployments. These data are analyzed to track and socially classify attendees, predict their interest in other people and demonstration installations, profile the restlessness of a crowd in an auditorium, and otherwise track the evolution and dynamics of the events at which the badges were deployed.
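As one hedged example of the kind of analysis described, the sketch below ranks a wearer's likely interest in other attendees by total face-to-face infrared contact time. The log format and the duration-based scoring are illustrative assumptions, not the paper's actual features or algorithms.

```python
# Illustrative sketch: rank likely interest in other attendees from logged
# infrared face-to-face contact events. Data format and scoring are assumed.
from collections import defaultdict

# (wearer_id, other_id, contact_seconds) tuples, as a badge log might record
ir_contacts = [
    ("A", "B", 40.0), ("A", "B", 25.0), ("A", "C", 5.0), ("A", "D", 90.0),
]

def rank_interest(log, wearer):
    """Total face-to-face time per partner; longer contact ~ higher interest."""
    totals = defaultdict(float)
    for who, other, seconds in log:
        if who == wearer:
            totals[other] += seconds
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

print(rank_interest(ir_contacts, "A"))  # [('D', 90.0), ('B', 65.0), ('C', 5.0)]
```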


interaction design and children | 2010

Make a Riddle and TeleStory: designing children's applications for the Siftables platform

Seth E. Hunter; Jeevan James Kalanithi; David Merrill

We present the design of Make a Riddle and TeleStory, educational applications developed on the Siftables platform for children aged 4-7 years. Siftables are hybrid tangible-graphical user interface devices with motion and neighbor sensing, graphical display, and wireless communication. Siftables provide a unique opportunity to give children responsive feedback about the movement and arrangement of a distributed set of objects. We contrast use cases that include an external display with the use of Siftables as a standalone application platform. For hybrid tangible-graphical user interfaces, we outline design strategies for communicating the affordances of Siftables, along with methods of providing dynamic feedback that encourage manipulation and increase engagement during application use.


human factors in computing systems | 2008

The sound of touch: physical manipulation of digital sound

David Merrill; Hayes Solos Raffle; Roberto Aimi

The Sound of Touch is a new tool for real-time capture and sensitive physical stimulation of sound samples using digital convolution. Our hand-held wand can be used to (1) record sound, then (2) play back the recording by brushing, scraping, striking or otherwise physically manipulating the wand against physical objects. During playback, the recorded sound is continuously filtered by the acoustic interaction of the wand and the material being touched. The Sound of Touch enables a physical and continuous sculpting of sound that is typical of acoustic musical instruments and interactions with natural objects and materials, but not available in GUI-based tools or most electronic music instruments. This paper reports the design of the system and observations of thousands of users interacting with it in an exhibition format. Preliminary user feedback suggests future applications to Foley, professional sound design, and musical performance.
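The core signal path, convolving the recorded sample with the live contact signal from the wand, can be sketched offline as below. This is an illustrative reconstruction under stated assumptions, not the system's actual DSP code; a real-time version would use low-latency block (overlap-add) convolution.

```python
# Offline sketch of the signal path: the recorded sample acts as a filter
# that is convolved with the live contact signal from the wand. The random
# arrays are stand-ins for real audio; not the system's actual DSP code.
import numpy as np
from scipy.signal import fftconvolve

sample_rate = 44_100
recorded = np.random.randn(sample_rate)       # stand-in: 1 s recorded sample
contact = np.random.randn(sample_rate * 2)    # stand-in: 2 s of wand contact

# Scraping a rough texture yields a sustained, noisy excitation; striking a
# hard object yields an impulse-like one, so the output resembles the sample.
output = fftconvolve(contact, recorded)
output /= np.max(np.abs(output))              # normalize to avoid clipping
```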


new interfaces for musical expression | 2007

A unified toolkit for accessing human interface devices in Pure Data and Max/MSP

Hans-Christoph Steiner; David Merrill; Olaf Matthes

In this paper we discuss our progress on the HID toolkit, a collection of software modules for the Pure Data and Max/MSP programming environments that provide unified, user-friendly and cross-platform access to human interface devices (HIDs) such as joysticks, digitizer tablets, and stomp-pads. These HIDs are ubiquitous, inexpensive and capable of sensing a wide range of human gesture, making them appealing interfaces for interactive media control. However, it is difficult to utilize many of these devices in custom-made applications, particularly for novices. The modules we discuss in this paper are [hidio], which handles incoming and outgoing data between a patch and a HID, and [input_noticer], which monitors HID plug/unplug events. Our goal in creating these modules is to preserve maximal flexibility in accessing the input and output capabilities of HIDs, in a manner that is approachable for both sophisticated and beginning designers. This paper documents our design notes and implementation considerations, current progress, and ideas for future extensions to the HID toolkit.
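[hidio] and [input_noticer] are Pd/Max externals, so there is no Python API to show; as a rough cross-language analogue, the sketch below uses the hidapi Python bindings (the `hid` module) to enumerate attached HIDs and poll one for raw input reports. The device choice and report size here are illustrative assumptions.

```python
# Cross-language analogue of the HID-access idea: enumerate human interface
# devices and poll one for raw input reports using the hidapi bindings.
import hid  # pip install hidapi

for info in hid.enumerate():
    print(f"{info['vendor_id']:#06x}:{info['product_id']:#06x}",
          info.get("product_string"))

# Open the first enumerated device and read a few raw reports, loosely
# analogous to the event stream [hidio] delivers into a patch.
devices = hid.enumerate()
if devices:
    d = hid.device()
    d.open(devices[0]["vendor_id"], devices[0]["product_id"])
    d.set_nonblocking(True)
    for _ in range(10):
        report = d.read(64)  # up to 64 bytes per input report
        if report:
            print(report)
    d.close()
```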


hawaii international conference on system sciences | 2009

Optimizing Visual Feature Perception for an Automatic Wearable Speech Supplement in Face-to-Face Communication and Classroom Situations

Dominic W. Massaro; David Merrill

Given the limitations many individuals face in hearing and understanding speech, we plan to supplement the sound of speech and speechreading with an additional informative visual input. Acoustic characteristics of the speech will be transformed into readily perceivable visual characteristics. The goal is to design a device seamlessly worn by the listener, which will perform continuous real-time acoustic analysis of his or her interlocutor's speech. This device would transform several continuous acoustic features of the talker's speech into continuous visual features, which will be simultaneously displayed on the speechreader's eyeglasses. The current research evaluates how easily a number of different visual configurations are learned and perceived, with the goal of optimizing the visual feature presentation and implementing it in the wearable computer system.
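A minimal sketch of the acoustic-to-visual transform being evaluated: compute one continuous acoustic feature per frame (here, short-time RMS energy) and map it to a continuous visual parameter, such as the brightness of an element on the eyeglass display. The frame length and the specific feature and mapping are assumptions for illustration.

```python
# Illustrative sketch: map per-frame RMS energy of a speech signal to a
# 0..1 brightness value for a visual display element. Frame size and the
# normalization are assumed, not taken from the paper.
import numpy as np

def frame_energy_to_brightness(signal, frame_len=512):
    """Per-frame RMS energy, normalized into a 0..1 display range."""
    n_frames = len(signal) // frame_len
    frames = signal[: n_frames * frame_len].reshape(n_frames, frame_len)
    rms = np.sqrt((frames ** 2).mean(axis=1))
    return rms / (rms.max() + 1e-12)

speech = np.random.randn(44_100)  # stand-in for 1 s of captured speech
print(frame_energy_to_brightness(speech)[:5])
```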


international conference on multimodal interfaces | 2008

iGlasses: an automatic wearable speech supplement in face-to-face communication and classroom situations

Dominic W. Massaro; Miguel Á. Carreira-Perpiñán; David Merrill; Cass Sterling; Stephanie Bigler; Elise Piazza; Marcus Perlman

The need for language aids is pervasive in today's world. Millions of individuals have language and speech challenges, and these individuals require additional support for communication and language learning. We demonstrate technology that supplements common face-to-face language interaction to enhance intelligibility, understanding, and communication, particularly for those with hearing impairments. Our research investigates how to automatically supplement talking faces with information that is ordinarily conveyed by auditory means. This research consists of two areas of inquiry: 1) developing a neural network to perform real-time analysis of selected acoustic features for visual display, and 2) determining how quickly participants can learn to use these selected cues and how much they benefit from them when combined with speechreading.
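For inquiry (1), a single real-time analysis step might look like the following: a small feed-forward network maps a frame of acoustic features to visual cue intensities. The architecture, dimensions, and random weights are placeholders; the paper's actual network and training procedure are not reproduced here.

```python
# Placeholder sketch of one real-time inference step: a tiny feed-forward
# network maps an acoustic feature frame to visual cue intensities in 0..1.
# Sizes and weights are illustrative, not the paper's trained network.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(16, 8)), np.zeros(8)  # 16 acoustic features -> 8 hidden
W2, b2 = rng.normal(size=(8, 3)), np.zeros(3)   # 8 hidden -> 3 visual cues

def cues_from_frame(features):
    """Acoustic feature frame -> cue values, one frame at a time."""
    h = np.tanh(features @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid keeps cues in 0..1

frame = rng.normal(size=16)  # stand-in for one analyzed frame of speech
print(cues_from_frame(frame))
```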


international conference on computer graphics and interactive techniques | 2007

The sound of touch

David Merrill; Hayes Solos Raffle

All people have experienced hearing sounds produced when they touch and manipulate different materials. We know what it will sound like to bang our fist against a wooden door, or to crumple a piece of newspaper. We can imagine what a coffee mug will sound like if it is dropped onto a concrete floor. But our wealth of experience handling physical materials does not typically produce much intuition for operating a new electronic instrument, given the inherently arbitrary mapping from gesture to sound.


international conference on computer graphics and interactive techniques | 2007

The sound of touch: a wand and texture kit for sonic exploration

David Merrill; Hayes Solos Raffle

In this paper we describe the Sound of Touch, a new instrument for real-time capture and sensitive physical stimulation of sound samples using digital convolution. Our hand-held wand can be used to (1) record sound, then (2) brush, scrape, strike or otherwise physically manipulate this sound against physical objects. These actions produce sound in a manner that leverages people's existing intuitions about the sonic properties of physical materials. The Sound of Touch permits real-time exploitation of the sonic properties of a physical environment, achieving a rich and expressive control of digital sound that is not typically possible in electronic sound synthesis and control systems.

Collaboration


Dive into David Merrill's collaborations.

Top Co-Authors

Jeevan James Kalanithi, Massachusetts Institute of Technology
Pattie Maes, Massachusetts Institute of Technology
Joseph A. Paradiso, Massachusetts Institute of Technology
Alex Pentland, Massachusetts Institute of Technology
Benjamin Vigoda, Massachusetts Institute of Technology
Cass Sterling, University of California
Elise Piazza, University of California