Publication


Featured research published by Xiang Cao.


User Interface Software and Technology | 2009

Detecting and leveraging finger orientation for interaction with direct-touch surfaces

Feng Wang; Xiang Cao; Xiangshi Ren; Pourang Irani

Current interactions on direct-touch interactive surfaces are often modeled based on properties of the input channel that are common in traditional graphical user interfaces (GUIs), such as x-y coordinate information. Leveraging additional information available on the surfaces could potentially result in richer and novel interactions. In this paper we specifically explore the role of finger orientation. This property is typically ignored in touch-based interactions, partly because of the ambiguity in determining it solely from the contact shape. We present a simple algorithm that unambiguously detects the directed finger orientation vector in real time from contact information only, by considering the dynamics of the finger landing process. Results of an experimental evaluation show that our algorithm is stable and accurate. We then demonstrate how finger orientation can be leveraged to enable novel interactions and to infer higher-level information such as hand occlusion or user position. We present a set of orientation-aware interaction techniques and widgets for direct-touch surfaces.
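To make the landing-dynamics idea concrete, here is a minimal sketch (Python with NumPy; not the authors' published code) of one way it could work: image moments of the contact shape yield an undirected major axis, and the drift of the contact centroid during landing, assumed here to move from the fingertip toward the finger body as the pad rolls down, resolves the 180-degree ambiguity.

```python
import numpy as np

def ellipse_axis(mask):
    """Centroid and undirected major-axis direction of a binary contact mask."""
    ys, xs = np.nonzero(mask)
    c = np.array([xs.mean(), ys.mean()])
    dx, dy = xs - c[0], ys - c[1]
    mu20, mu02, mu11 = (dx * dx).mean(), (dy * dy).mean(), (dx * dy).mean()
    theta = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)   # known only up to 180 degrees
    return c, np.array([np.cos(theta), np.sin(theta)])

def directed_orientation(landing_frames):
    """Disambiguate the axis using landing dynamics: as the finger pad rolls
    down, the contact centroid drifts away from the fingertip, so the
    fingertip direction opposes the observed drift (our assumption here)."""
    c_first, _ = ellipse_axis(landing_frames[0])
    c_last, axis = ellipse_axis(landing_frames[-1])
    drift = c_last - c_first
    return axis if np.dot(axis, drift) < 0 else -axis
```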


User Interface Software and Technology | 2007

Multi-user interaction using handheld projectors

Xiang Cao; Clifton Forlines; Ravin Balakrishnan

Recent research on handheld projector interaction has expanded the display and interaction space of handheld devices by projecting information onto the physical environment around the user, but has mainly focused on single-user scenarios. We extend this prior single-user research to co-located multi-user interaction using multiple handheld projectors. We present a set of interaction techniques for supporting co-located collaboration with multiple handheld projectors, and discuss application scenarios enabled by them.


User Interface Software and Technology | 2003

VisionWand: interaction techniques for large displays using a passive wand tracked in 3D

Xiang Cao; Ravin Balakrishnan

A passive wand tracked in 3D using computer vision techniques is explored as a new input mechanism for interacting with large displays. We demonstrate a variety of interaction techniques that exploit the affordances of the wand, resulting in an effective interface for large scale interaction. The lack of any buttons or other electronics on the wand presents a challenge that we address by developing a set of postures and gestures to track state and enable command input. We also describe the use of multiple wands, and posit designs for more complex wands in the future.
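As a rough illustration of why two tracked tips suffice as an input vocabulary, the sketch below (an assumption-laden simplification, with z taken as the vertical axis) derives the pose parameters a button-less wand can express once the vision system has triangulated its two colored ends to 3D points:

```python
import numpy as np

def wand_pose(tip_a, tip_b):
    """Pose parameters a button-less wand can express, given the 3D
    positions of its two colored tips (z assumed to point up)."""
    center = (tip_a + tip_b) / 2.0              # where the wand is
    v = tip_b - tip_a
    length = np.linalg.norm(v)                  # apparent length, a depth cue
    direction = v / length                      # pointing ray for the display
    tilt = np.degrees(np.arcsin(direction[2]))  # angle above the horizontal
    return center, direction, length, tilt
```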


IEEE International Workshop on Horizontal Interactive Human-Computer Systems | 2008

ShapeTouch: Leveraging contact shape on interactive surfaces

Xiang Cao; Andrew D. Wilson; Ravin Balakrishnan; Ken Hinckley; Scott E. Hudson

Many interactive surfaces have the ability to detect the shape of hands or objects placed on them. However, shape information is typically either condensed to individual contact points or categorized as discrete gestures. This does not leverage the full expressiveness of touch input, and thus limits the actions users can perform in interactive applications. We present ShapeTouch, an exploration of interactions that directly utilize the contact shape on interactive surfaces to manipulate objects and interactors. ShapeTouch infers virtual contact forces from contact regions and motion to enable interaction with virtual objects in ways that draw upon users' everyday experiences of interacting with real physical objects.
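A minimal sketch of the force-inference idea, under assumptions the abstract does not spell out: contact area serves as a proxy for normal pressure, and the motion of the region's centroid gives a tangential push. The published system is considerably richer than this (per-region forces, friction-like behaviors, and so on).

```python
import numpy as np

def contact_force(prev_mask, mask, dt):
    """Approximate a 2D virtual force from one contact region: magnitude
    scales with contact area (a stand-in for pressure), direction follows
    the motion of the region's centroid between frames."""
    def centroid(m):
        ys, xs = np.nonzero(m)
        return np.array([xs.mean(), ys.mean()])
    area = float(mask.sum())                   # proxy for normal force
    velocity = (centroid(mask) - centroid(prev_mask)) / dt
    return area * velocity                     # tangential push on virtual objects
```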


User Interface Software and Technology | 2006

Interacting with dynamically defined information spaces using a handheld projector and a pen

Xiang Cao; Ravin Balakrishnan

The recent trend towards miniaturization of projection technology indicates that handheld devices will soon have the ability to project information onto any surface, thus enabling interfaces that are not possible with current handhelds. We explore the design space of dynamically defining and interacting with multiple virtual information spaces embedded in a physical environment using a handheld projector and a passive pen tracked in 3D. We develop techniques for defining and interacting with these spaces, and explore usage scenarios.


Human Factors in Computing Systems | 2011

Grips and gestures on a multi-touch pen

Hyunyoung Song; Hrvoje Benko; François Guimbretière; Shahram Izadi; Xiang Cao; Ken Hinckley

This paper explores the interaction possibilities enabled when the barrel of a digital pen is augmented with a multi-touch sensor. We present a novel multi-touch pen (MTPen) prototype and discuss its alternate uses beyond those of a standard stylus, such as allowing new touch gestures to be performed using the index finger or thumb and detecting how users grip the device as a mechanism for mode switching. We also discuss the hardware and software implementation challenges in realizing our prototype, and showcase how one can combine different grips (tripod, relaxed tripod, sketch, wrap) and gestures (swipe and double tap) to enable new interaction techniques with the MTPen in a prototype drawing application. One specific aim is the elimination of some of the comfort problems associated with existing auxiliary controls on digital pens. Mechanical controls such as barrel buttons and barrel scroll wheels work best in only a few specific hand grips and pen rotations. Comparatively, our gestures can be successfully and comfortably performed regardless of the rotation of the pen or how the user grips it, offering greater flexibility in use. We describe a formal evaluation comparing MTPen gestures against the use of a barrel button for mode switching. This study shows that both swipe and double-tap gestures are comparable in performance to commonly employed barrel buttons, without their disadvantages.
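To illustrate how rotation-invariant grip sensing might work, here is a hedged sketch assuming the barrel sensor delivers a small capacitance image unrolled around the pen's circumference; the abstract does not describe the MTPen recognizer at this level, so the template-matching scheme below is purely illustrative.

```python
import numpy as np

GRIPS = ("tripod", "relaxed tripod", "sketch", "wrap")  # the four grips studied

def classify_grip(frame, templates):
    """Match the unrolled barrel image against one stored template per grip,
    trying every rotation of the barrel so recognition does not depend on
    how the pen is rotated in the hand (a key property claimed above)."""
    best_grip, best_err = None, np.inf
    for grip, template in templates.items():
        for shift in range(frame.shape[1]):    # rotate one column at a time
            err = np.sum((np.roll(frame, shift, axis=1) - template) ** 2)
            if err < best_err:
                best_grip, best_err = grip, err
    return best_grip
```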


Conference on Computer Supported Cooperative Work | 2010

Home video communication: mediating 'closeness'

David S. Kirk; Abigail Sellen; Xiang Cao

Video-mediated communication (VMC) technologies are rapidly being adopted by home users. Little research has previously been conducted into why home users choose to use VMC or into their practices surrounding its use. We present the results of an interview and diary-based study of 17 people about their uses of, and attitudes towards, VMC. We highlight the artful ways in which users appropriate VMC to reconcile a desire for closeness with those with whom they communicate, and we explore the rich ways in which VMC supports different expressions of this desire. We conclude with discussions of how next-generation VMC technologies might be designed to take advantage of this understanding of human values in communicative practice.


Conference on Computer Supported Cooperative Work | 2010

Telling the whole story: anticipation, inspiration and reputation in a field deployment of TellTable

Xiang Cao; Siân E. Lindley; John Helmes; Abigail Sellen

We present a field study of TellTable, a new storytelling system designed to support creativity and collaboration amongst children. The application was deployed on a multi-touch interactive table in the library of a primary school, where children could use it to create characters and scenery based on elements of the physical world (captured through photography) as well as through drawing. These could then be used to record a story which could be played back. TellTable allowed children to collaborate in devising stories that mixed the physical and the digital in creative ways and that could include themselves as characters. Additionally, the field deployment illustrated how children took inspiration from one another's stories, how they planned elements of their own tales before using the technology, and how the fact that stories could be accessed in the library led some to become well-known and popular within the school community. The real story here, we argue, needs to take into account all that happens within the wider context of use of this system.


User Interface Software and Technology | 2009

Mouse 2.0: multi-touch meets the mouse

Nicolas Villar; Shahram Izadi; Dan Rosenfeld; Hrvoje Benko; John Helmes; Jonathan Westhues; Steve Hodges; Eyal Ofek; Alex Butler; Xiang Cao; Billy Chen

In this paper we present novel input devices that combine the standard capabilities of a computer mouse with multi-touch sensing. Our goal is to enrich traditional pointer-based desktop interactions with touch and gestures. To chart the design space, we present five different multi-touch mouse implementations. Each explores a different touch sensing strategy, which leads to differing form-factors and hence interactive possibilities. In addition to the detailed description of hardware and software implementations of our prototypes, we discuss the relative strengths, limitations and affordances of these novel input devices as informed by the results of a preliminary user study.


Interactive Tabletops and Surfaces | 2013

The sound of touch: on-body touch and gesture sensing based on transdermal ultrasound propagation

Adiyan Mujibiya; Xiang Cao; Desney S. Tan; Dan Morris; Shwetak N. Patel; Jun Rekimoto

Recent work has shown that the body provides an interesting interaction platform. We propose a novel sensing technique based on transdermal low-frequency ultrasound propagation. This technique enables pressure-aware continuous touch sensing as well as arm-grasping hand gestures on the human body. We describe the phenomena we leverage as well as the system that produces ultrasound signals on one part of the body and measures this signal on another. The measured signal varies according to the measurement location, forming distinctive propagation profiles that can be used to infer on-body touch locations and on-body gestures. We also report on a series of experimental studies with 20 participants that characterize the signal and show robust touch and gesture classification along the forearm.
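A minimal sketch of such a pipeline, with assumed details (probe frequencies, FFT-based amplitude features, and an off-the-shelf SVM standing in for whatever classifier the authors actually used):

```python
import numpy as np
from sklearn.svm import SVC

def propagation_profile(signal, fs, probe_freqs):
    """Reduce one received frame to the amplitude at each probe frequency,
    normalized so the profile is robust to overall coupling changes."""
    spectrum = np.abs(np.fft.rfft(signal))
    bins = [int(round(f * len(signal) / fs)) for f in probe_freqs]
    return spectrum[bins] / spectrum.sum()

# Train on labeled profiles (one label per on-body touch location), then
# classify live frames the same way:
clf = SVC(kernel="rbf")
# clf.fit(np.array(training_profiles), training_locations)
# location = clf.predict([propagation_profile(frame, fs, probe_freqs)])
```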
