Publication


Featured research published by Ravin Balakrishnan.


User Interface Software and Technology | 2003

Multi-finger and whole hand gestural interaction techniques for multi-user tabletop displays

Mike Wu; Ravin Balakrishnan

Recent advances in sensing technology have enabled a new generation of tabletop displays that can sense multiple points of input from several users simultaneously. However, apart from a few demonstration techniques [17], current user interfaces do not take advantage of this increased input bandwidth. We present a variety of multifinger and whole hand gestural interaction techniques for these displays that leverage and extend the types of actions that people perform when interacting on real physical tabletops. Apart from gestural input techniques, we also explore interaction and visualization techniques for supporting shared spaces, awareness, and privacy. These techniques are demonstrated within a prototype room furniture layout application, called RoomPlanner.


User Interface Software and Technology | 2004

Interactive public ambient displays: transitioning from implicit to explicit, public to personal, interaction with multiple users

Daniel Vogel; Ravin Balakrishnan

We develop design principles and an interaction framework for sharable, interactive public ambient displays that support the transition from implicit to explicit interaction with both public and personal information. A prototype system implementation that embodies these design principles is described. We use novel display and interaction techniques such as simple hand gestures and touch screen input for explicit interaction and contextual body orientation and position cues for implicit interaction. Techniques are presented for subtle notification, self-revealing help, privacy controls, and shared use by multiple people each in their own context. Initial user feedback is also presented, and future directions discussed.


User Interface Software and Technology | 2005

Distant freehand pointing and clicking on very large, high resolution displays

Daniel Vogel; Ravin Balakrishnan

We explore the design space of freehand pointing and clicking interaction with very large high resolution displays from a distance. Three techniques for gestural pointing and two for clicking are developed and evaluated. In addition, we present subtle auditory and visual feedback techniques to compensate for the lack of kinesthetic feedback in freehand interaction, and to promote learning and use of appropriate postures.


Human Factors in Computing Systems | 2007

Direct-touch vs. mouse input for tabletop displays

Clifton Forlines; Daniel Wigdor; Chia Shen; Ravin Balakrishnan

We investigate the differences -- in terms of both quantitative performance and subjective preference -- between direct-touch and mouse input for unimanual and bimanual tasks on tabletop displays. The results of two experiments show that for bimanual tasks performed on tabletops, users benefit from direct-touch input. However, our results also indicate that mouse input may be more appropriate for a single user working on tabletop tasks requiring only single-point interaction.


Human Factors in Computing Systems | 2006

Keepin' it real: pushing the desktop metaphor with physics, piles and the pen

Anand Agarawala; Ravin Balakrishnan

We explore making virtual desktops behave in a more physically realistic manner by adding physics simulation and using piling instead of filing as the fundamental organizational structure. Objects can be casually dragged and tossed around, influenced by physical characteristics such as friction and mass, much like we would manipulate lightweight objects in the real world. We present a prototype, called BumpTop, that coherently integrates a variety of interaction and visualization techniques optimized for pen input we have developed to support this new style of desktop organization.


Human Factors in Computing Systems | 2006

Hover widgets: using the tracking state to extend the capabilities of pen-operated devices

Tovi Grossman; Ken Hinckley; Patrick Baudisch; Maneesh Agrawala; Ravin Balakrishnan

We present Hover Widgets, a new technique for increasing the capabilities of pen-based interfaces. Hover Widgets are implemented by using the pen movements above the display surface, in the tracking state. Short gestures while hovering, followed by a pen down, access the Hover Widgets, which can be used to activate localized interface widgets. By using the tracking state movements, Hover Widgets create a new command layer which is clearly distinct from the input layer of a pen interface. In a formal experiment Hover Widgets were found to be faster than a more traditional command activation technique, and also reduced errors due to divided attention.


User Interface Software and Technology | 2003

TiltText: using tilt for text input to mobile phones

Daniel Wigdor; Ravin Balakrishnan

TiltText, a new technique for entering text into a mobile phone, is described. The standard 12-button text entry keypad of a mobile phone forces ambiguity when the 26-letter Roman alphabet is mapped in the traditional manner onto keys 2-9. The TiltText technique uses the orientation of the phone to resolve this ambiguity, by tilting the phone in one of four directions to choose which character on a particular key to enter. We first discuss implementation strategies, and then present the results of a controlled experiment comparing TiltText to MultiTap, the most common text entry technique. The experiment included 10 participants who each entered a total of 640 phrases of text chosen from a standard corpus, over a period of about five hours. The results show that text entry speed including correction for errors using TiltText was 23% faster than MultiTap by the end of the experiment, despite a higher error rate for TiltText. TiltText is thus amongst the fastest known language-independent techniques for entering text into mobile phones.
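The key-plus-tilt disambiguation described above can be sketched as a simple two-level lookup. This is an illustrative sketch, not the paper's implementation, and the specific tilt-to-letter convention used here (left, forward, right, back selecting the first through fourth letter) is an assumption:

```python
# Standard 12-key phone keypad: keys 2-9 carry the 26 Roman letters.
KEYPAD = {
    "2": "abc", "3": "def", "4": "ghi", "5": "jkl",
    "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz",
}

# Assumed tilt-direction convention; only keys 7 and 9 have a
# fourth letter, so "back" is valid only on those keys.
TILT_INDEX = {"left": 0, "forward": 1, "right": 2, "back": 3}

def tilt_text(key: str, tilt: str) -> str:
    """Resolve an ambiguous keypress using the phone's tilt direction."""
    letters = KEYPAD[key]
    index = TILT_INDEX[tilt]
    if index >= len(letters):
        raise ValueError(f"key {key} has no letter for tilt {tilt!r}")
    return letters[index]

print(tilt_text("2", "left"))   # "a"
print(tilt_text("7", "back"))   # "s"
```

A single tilt per keypress replaces the repeated presses MultiTap needs, which is where the reported speed advantage comes from.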


User Interface Software and Technology | 2008

ILoveSketch: as-natural-as-possible sketching system for creating 3D curve models

Seok-Hyung Bae; Ravin Balakrishnan; Karan Singh

We present ILoveSketch, a 3D curve sketching system that captures some of the affordances of pen and paper for professional designers, allowing them to iterate directly on concept 3D curve models. The system coherently integrates existing techniques of sketch-based interaction with a number of novel and enhanced features. Novel contributions of the system include automatic view rotation to improve curve sketchability, an axis widget for sketch surface selection, and implicitly inferred changes between sketching techniques. We also improve on a number of existing ideas such as a virtual sketchbook, simplified 2D and 3D view navigation, multi-stroke NURBS curve creation, and a cohesive gesture vocabulary. An evaluation by a professional designer shows the potential of our system for deployment within a real design process.


User Interface Software and Technology | 2004

Multi-finger gestural interaction with 3D volumetric displays

Tovi Grossman; Daniel Wigdor; Ravin Balakrishnan

Volumetric displays provide interesting opportunities and challenges for 3D interaction and visualization, particularly when used in a highly interactive manner. We explore this area through the design and implementation of techniques for interactive direct manipulation of objects with a 3D volumetric display. Motion tracking of the user's fingers provides for direct gestural interaction with the virtual objects, through manipulations on and around the display's hemispheric enclosure. Our techniques leverage the unique features of volumetric displays, including a 360° viewing volume that enables manipulation from any viewpoint around the display, as well as natural and accurate perception of true depth information in the displayed 3D scene. We demonstrate our techniques within a prototype 3D geometric model building application.


Human Factors in Computing Systems | 1997

The Rockin'Mouse: integral 3D manipulation on a plane

Ravin Balakrishnan; Thomas Baudel; Gordon Kurtenbach; George W. Fitzmaurice

A novel input device called the Rockin’Mouse is described and evaluated. The Rockin’Mouse is a four degree-of-freedom input device that has the same shape as a regular mouse except that the bottom of the Rockin’Mouse is rounded so that it can be tilted. This tilting can be used to control two extra degrees of freedom, thus making it suitable for manipulation in 3D environments. Like the regular mouse, the Rockin’Mouse can sense planar position and perform all the usual functions. However, in a 3D scene a regular mouse can only operate on 2 dimensions at a time and therefore manipulation in 3D requires a way to switch between dimensions. With the Rockin’Mouse, however, all the dimensions can be simultaneously controlled. In this paper we describe our design rationale behind the Rockin’Mouse, and present an experiment which compares the Rockin’Mouse to the standard mouse in a typical 3D interaction task. Our results indicate that the Rockin’Mouse is 30% faster and is a promising device for both 2D and 3D interaction.

Collaboration


Dive into Ravin Balakrishnan's collaboration.

Top Co-Authors

Clifton Forlines

Mitsubishi Electric Research Laboratories

Jian Zhao

University of Toronto
