Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Dean Rubine is active.

Publication


Featured research published by Dean Rubine.


International Conference on Computer Graphics and Interactive Techniques | 1991

Specifying gestures by example

Dean Rubine

Gesture-based interfaces offer an alternative to traditional keyboard, menu, and direct manipulation interfaces. The ability to specify objects, an operation, and additional parameters with a single intuitive gesture appeals to both novice and experienced users. Unfortunately, gesture-based interfaces have not been extensively researched, partly because they are difficult to create. This paper describes GRANDMA, a toolkit for rapidly adding gestures to direct manipulation interfaces. The trainable single-stroke gesture recognizer used by GRANDMA is also described.
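The trainable recognizer mentioned above works by extracting geometric features from a single stroke and scoring each gesture class with a linear classifier. A minimal sketch of that idea (the four-feature subset and the toy weights are illustrative, not the paper's exact feature set):

```python
import math

def stroke_features(points):
    """Compute a small subset of Rubine-style geometric features for a
    single stroke given as a list of (x, y) points (needs >= 3 points)."""
    (x0, y0), (x2, y2) = points[0], points[2]
    # cosine and sine of the initial stroke angle
    d = math.hypot(x2 - x0, y2 - y0) or 1.0  # guard against a zero-length start
    f1, f2 = (x2 - x0) / d, (y2 - y0) / d
    # length of the bounding-box diagonal
    xs = [p[0] for p in points]; ys = [p[1] for p in points]
    diag = math.hypot(max(xs) - min(xs), max(ys) - min(ys))
    # total path length of the stroke
    length = sum(math.hypot(points[i + 1][0] - points[i][0],
                            points[i + 1][1] - points[i][1])
                 for i in range(len(points) - 1))
    return [f1, f2, diag, length]

def classify(features, weights):
    """Linear classifier: class c scores w_c[0] + sum_i w_c[i+1] * f_i;
    the highest-scoring class wins."""
    def score(w):
        return w[0] + sum(wi * fi for wi, fi in zip(w[1:], features))
    return max(weights, key=lambda c: score(weights[c]))
```

For example, with toy weights keyed only on the initial-angle features, a rightward stroke and an upward stroke separate cleanly:

```python
W = {"right": [0, 5, 0, 0, 0], "up": [0, 0, 5, 0, 0]}
classify(stroke_features([(0, 0), (1, 0), (2, 0), (3, 0)]), W)  # "right"
```

In the published recognizer the weights come from training examples, not hand-tuning.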


Human Factors in Computing Systems | 1992

Combining gestures and direct manipulation

Dean Rubine

A gesture, as the term is used here, is a handmade mark used to give a command to a computer. The attributes of the gesture (its location, size, extent, orientation, and dynamic properties) can be mapped to parameters of the command. An operation, operands, and parameters can all be communicated simultaneously with a single, intuitive, easily drawn gesture. This makes gesturing an attractive interaction technique. Typically, a gestural interaction is completed (e.g., the stylus is lifted) before the gesture is classified, its attributes computed, and the intended command performed. There is no opportunity for the interactive manipulation of parameters in the presence of application feedback that is typical of drag operations in direct manipulation interfaces. This lack of continuous feedback during the interaction makes the use of gestures awkward for tasks that require such feedback. The video presents a two-phase interaction technique that combines gesture and direct manipulation. A two-phase interaction begins with a gesture, which is recognized during the interaction (e.g., while the stylus is still touching the writing surface). After recognition, the application is informed and the interaction continues, allowing the user to manipulate parameters interactively. The result is a powerful interaction which combines the advantages of gesturing and direct manipulation.
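The two-phase technique described above can be sketched as a small state machine (hypothetical interface, not the paper's implementation): input points accumulate until a recognizer fires mid-stroke, after which every subsequent point updates the recognized command's parameters interactively.

```python
class TwoPhaseInteraction:
    """Sketch of a two-phase interaction: a gesture phase that feeds a
    recognizer, then a manipulation phase with continuous feedback."""

    def __init__(self, recognize, min_points=5):
        self.recognize = recognize    # callable: points -> command name or None
        self.min_points = min_points  # don't try to classify a tiny stroke
        self.points = []
        self.command = None           # set once the gesture is recognized

    def on_move(self, x, y):
        """Handle one input point; returns None during the gesture phase,
        or (command, x, y) once the manipulation phase has begun."""
        self.points.append((x, y))
        if self.command is None and len(self.points) >= self.min_points:
            self.command = self.recognize(self.points)  # phase switch
        if self.command is not None:
            return (self.command, x, y)  # live parameter update
        return None
```

The key property, as in the abstract, is that recognition happens while the stylus is still down, so the tail of the same stroke drives the command's parameters.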


Computer Music Journal | 1990

Programmable Finger-Tracking Instrument Controllers

Dean Rubine; Paul McAvinney

Traditionally, the controls of a musical instrument have been mechanically connected to its sound generators. It is well known that electrical couplings may be used instead of mechanical ones, and that digital instead of analog connections may be used. The freedom of not having the controls mechanically constrained to the means of sound production brings a corresponding burden. What form should the controls now take? Knowledge of the practice of music, psychoacoustics, ergonomics, sensing technologies, design, computer science, and economics can all be brought to bear on the question; common sense and personal taste also play a large role. Our interest in instrument control has led us to


Multimedia Systems | 1993

Tactus: toolkit-level support for synchronized interactive multimedia

Roger B. Dannenberg; Thomas P. Neuendorffer; Joseph M. Newcomer; Dean Rubine; David B. Anderson

Tactus addresses problems of synchronizing and controlling various interactive continuous-time media. The Tactus system consists of two main parts. The first is a server that synchronizes the presentation of multiple media, including audio, video, graphics, and MIDI at a workstation. The second is a set of extensions to a graphical user interface toolkit to help compute and/or control temporal streams of information and deliver them to the Tactus Server. Temporal toolkit objects schedule computation events that generate media. Computation is scheduled in advance of real time to overcome system latency, and timestamps are used to allow accurate synchronization by the server in spite of computation and transmission delays. Tactus supports precomputing branches of media streams to minimize latency in interactive applications.
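The scheduling idea in the abstract, computing ahead of real time and timestamping events so a server can present them in sync, can be sketched roughly as follows (class and method names are invented, not the Tactus API):

```python
import heapq

class AheadOfTimeScheduler:
    """Toy sketch of ahead-of-real-time scheduling: each media event is
    computed `lookahead` seconds before its presentation time and tagged
    with a presentation timestamp, so a server could synchronize output
    despite computation and transmission delays."""

    def __init__(self, lookahead):
        self.lookahead = lookahead
        self.queue = []  # heap of (compute_deadline, presentation_time, event)

    def schedule(self, presentation_time, event):
        heapq.heappush(self.queue,
                       (presentation_time - self.lookahead,
                        presentation_time, event))

    def due(self, now):
        """Pop every event whose computation deadline has arrived,
        returning (presentation_timestamp, event) pairs for the server."""
        out = []
        while self.queue and self.queue[0][0] <= now:
            _, ts, ev = heapq.heappop(self.queue)
            out.append((ts, ev))
        return out
```

With a 0.5 s lookahead, an event due for presentation at t = 1.0 is handed to the computation stage at t = 0.5; the attached timestamp, not the computation time, governs when it is actually presented.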


Computer Music Journal | 1986

Arctic: A Functional Language for Real-Time Systems

Roger B. Dannenberg; Paul McAvinney; Dean Rubine

In the past, real-time control via digital computer has been achieved more through ad hoc techniques than through a formal theory. Languages for real-time control have emphasized concurrency, access to hardware input/output (I/O) devices, interrupts, and mechanisms for scheduling tasks, rather than taking a high-level, problem-oriented approach in which implementation details are hidden. In this paper, we present an alternative approach to real-time control that enables the programmer to express the real-time response of a system in a declarative fashion rather than an imperative or procedural one. Examples of traditional, sequential languages for real-time control include Modula (Wirth 1977a; 1977b; 1982), Ada (DOD 1980), CSP (Hoare 1978), and OCCAM (May 1983). These languages all provide support for concurrency through multiple sequential threads of control. Programmers must work hard to make sure their processes execute the right instructions at the appropriate times, and real-time control is regarded as the most difficult form of programming (Glass 1980). In contrast, our approach (Dannenberg 1984; 1986) is based on a nonsequential model in which behavior in the time domain is specified explicitly. This model describes possible system responses to real-time conditions and provides a means for manipulating and composing responses. The programming language Arctic is based on the nonsequential model and was designed for use in real-time computer music programs. It should be emphasized that our efforts have concentrated on the development of a notation for specifying desired real-time behavior. Any implementation only approximates the desired behavior.
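The declarative, nonsequential model the abstract describes can be suggested with a tiny sketch in which a behavior is a function of time and combinators compose behaviors, rather than threads executing instructions (combinator names are invented; Arctic's actual notation and semantics differ):

```python
# A "behavior" is a function from time (seconds) to a value.
def const(v):
    """Behavior that holds a constant value."""
    return lambda t: v

def ramp(rate):
    """Behavior that grows linearly from zero at the given rate."""
    return lambda t: rate * t

def add(a, b):
    """Pointwise sum of two behaviors."""
    return lambda t: a(t) + b(t)

def delay(b, d):
    """Behavior b shifted d seconds later in time."""
    return lambda t: b(t - d)

# A control envelope composed declaratively: an offset plus a ramp,
# starting one second into the piece.
envelope = delay(add(const(1.0), ramp(2.0)), 1.0)
```

The point of the sketch is the style: the time-domain response is stated as a composition of values, and any scheduler that evaluates these functions at the right moments is an implementation detail.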


Network and Operating System Support for Digital Audio and Video | 1992

Tactus: Toolkit-Level Support for Synchronized Interactive Multimedia

Roger B. Dannenberg; Thomas P. Neuendorffer; Joseph M. Newcomer; Dean Rubine

Tactus addresses problems of synchronizing and controlling various interactive continuous-time media. The Tactus system consists of two main parts. The first is a server that synchronizes the presentation of multiple media, including audio, video, graphics, and MIDI, at a workstation. The second is a set of extensions to a graphical user interface toolkit to help compute and/or control temporal streams of information and deliver them to the Tactus Server. Temporal toolkit objects schedule computation events that generate media. Computation is scheduled in advance of real time to overcome system latency, and timestamps are used to allow accurate synchronization by the server in spite of computation and transmission delays. Tactus supports precomputing branches of media streams to minimize latency in interactive applications.


Contemporary Music Review | 1991

The videoharp: an optical scanning MIDI controller

Dean Rubine; Paul McAvinney

The videoharp is an optical-scanning musical instrument controller which senses multiple-finger gestures. It works by detecting the images of a performer's fingertips. From the images, the position and velocity of each fingertip are deduced. This information is translated into MIDI, which is then used to drive synthesizers for sound production. The translation is programmable, enabling the instrument to be played using different articulation techniques. The videoharp may be played with harp-like or keyboard-like gestures, by bowing or drumming motions, or by motions with no analogue in existing instrument technique.
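As an illustration of the programmable fingertip-to-MIDI translation the abstract mentions, here is one hypothetical mapping (the pitch range, scaling, and function name are invented, not the videoharp's actual translation): position selects pitch, fingertip velocity selects note-on velocity.

```python
def fingertip_to_midi(y, v, y_min=0.0, y_max=1.0, channel=0):
    """Map a fingertip position y (in [y_min, y_max]) and speed v to a
    3-byte MIDI note-on message: status, pitch, velocity."""
    # position -> pitch across a four-octave range (C2..C6)
    pitch = 36 + int((y - y_min) / (y_max - y_min) * 48)
    # speed -> note-on velocity, clamped to the legal MIDI range 1..127
    velocity = max(1, min(127, int(abs(v) * 127)))
    return bytes([0x90 | channel, pitch, velocity])
```

A fingertip at mid-range moving at half speed yields a note-on for middle C (MIDI note 60); swapping in a different mapping function is exactly the kind of reprogramming that lets the same hardware support harp-like, keyboard-like, or bowed playing styles.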


Archive | 1992

The automatic recognition of gestures

Dean Rubine


Journal of The Audio Engineering Society | 1990

Analysis and Synthesis of Tones by Spectral Interpolation

Marie-Hélène Serra; Dean Rubine; Roger B. Dannenberg


USENIX Summer | 1991

Integrating Gesture Recognition and Direct Manipulation.

Dean Rubine

Collaboration


Dive into Dean Rubine's collaborations.

Top Co-Authors

David B. Anderson

Carnegie Mellon University

Paul McAvinney

Carnegie Mellon University

Tom Neuendorffer

Carnegie Mellon University

Jim Zelenka

Carnegie Mellon University
