Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Yuanchun Shi is active.

Publication


Featured research published by Yuanchun Shi.


interactive tabletops and surfaces | 2010

pPen: enabling authenticated pen and touch interaction on tabletop surfaces

Yongqiang Qin; Chun Yu; Hao Jiang; Chenjun Wu; Yuanchun Shi

This paper introduces pPen, a pressure-sensitive digital pen that enables precise pressure and touch input on vision-based interactive tabletops. With the help of pPen input and feature-matching technology, we implemented a novel method supporting multi-user authenticated interaction in the bimanual pen-and-touch scenario: login is performed simply by stroking one's signature with pPen on the table; a binding between user and pPen is created at the same time, so that every subsequent pPen command is differentiated by user. We also conducted laboratory user studies, which demonstrated the method's security and its high resistance to shoulder surfing: in the evaluation, no attacker managed to log into another user's workspace.
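As a rough illustration of the login flow described above (not the paper's actual implementation; the similarity measure, the 0.9 threshold, and all names below are invented), a signature stroke is matched against enrolled templates and, on success, the pen's hardware ID is bound to the authenticated user so later commands can be attributed:

```python
# Hypothetical sketch of pPen-style signature login and user-pen binding.
from dataclasses import dataclass

@dataclass
class Stroke:
    points: list      # (x, y, pressure, t) samples captured while signing
    pen_id: str       # hardware ID of the pPen that drew the stroke

def match_score(stroke: Stroke, template: Stroke) -> float:
    # Toy similarity: compare pressure profiles sample-by-sample after
    # truncating to equal length (a stand-in for the paper's feature matching).
    n = min(len(stroke.points), len(template.points))
    if n == 0:
        return 0.0
    diff = sum(abs(a[2] - b[2]) for a, b in
               zip(stroke.points[:n], template.points[:n])) / n
    return max(0.0, 1.0 - diff)

def login(stroke: Stroke, templates: dict, bindings: dict,
          threshold: float = 0.9) -> str | None:
    """Match a signature stroke against enrolled templates; on success,
    bind this pen's ID to the authenticated user."""
    best_user, best = None, 0.0
    for user, tpl in templates.items():
        s = match_score(stroke, tpl)
        if s > best:
            best_user, best = user, s
    if best_user is not None and best >= threshold:
        bindings[stroke.pen_id] = best_user  # later pen events resolve to this user
        return best_user
    return None  # reject unknown signatures: the shoulder-surfing defense
```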


ubiquitous computing | 2011

Smart home on smart phone

Yu Zhong; Yue Suo; Wenchang Xu; Chun Yu; Xinwei Guo; Yuhang Zhao; Yuanchun Shi

With their high accessibility and usability, mobile phones are regarded as the ideal interface for users to monitor and control the approaching smart home environment. Moreover, networking technologies and protocols are now advanced enough to support a universal monitoring and controlling interface on smartphones. This paper presents HouseGenie, an interactive, direct-manipulation mobile application that supports a range of basic home monitoring and controlling functionalities, replacing the individual remotes of smart home appliances. HouseGenie also addresses several common requirements behind this vision, such as scenarios, short-delay alarms, and area restrictions. We demonstrate that HouseGenie not only provides intuitive presentation and interaction for smart home management, but also improves the user experience compared to existing solutions.
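Replacing per-appliance remotes with one interface amounts to putting a uniform device abstraction behind a single UI. The sketch below is a guess at the shape of such an abstraction (the class names and area-restriction method are assumptions, not HouseGenie's actual API):

```python
# Hypothetical device abstraction for a HouseGenie-style universal controller.
from abc import ABC, abstractmethod

class Device(ABC):
    def __init__(self, name: str, area: str):
        self.name, self.area = name, area

    @abstractmethod
    def status(self) -> dict: ...

    @abstractmethod
    def control(self, command: str, **params) -> None: ...

class Lamp(Device):
    def __init__(self, name, area):
        super().__init__(name, area)
        self.on = False
    def status(self):
        return {"on": self.on}
    def control(self, command, **params):
        if command == "power":
            self.on = params["on"]

class Home:
    """One place to monitor and control everything; supports area restriction."""
    def __init__(self):
        self.devices: list[Device] = []
    def add(self, d: Device):
        self.devices.append(d)
    def control_area(self, area: str, command: str, **params):
        for d in self.devices:
            if d.area == area:
                d.control(command, **params)

home = Home()
home.add(Lamp("ceiling", area="living room"))
home.control_area("living room", "power", on=True)  # one UI, many appliances
```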


advanced visual interfaces | 2010

Structured laser pointer: enabling wrist-rolling movements as a new interactive dimension

Yongqiang Qin; Yuanchun Shi; Hao Jiang; Chun Yu

In this paper, we revisit multi-point laser pointer interaction from a wrist-rolling perspective. We propose SLP (Structured Laser Pointer), which detects a laser pointer's rotation about its emitting axis. SLP adds wrist-rolling gestures as a new interactive dimension to the conventional laser pointer interaction approach. We asked a group of users to perform a set of tasks using SLP and derived from the test results a set of criteria for distinguishing incidental from intentional rolling; the experimental results further confirmed the high accuracy and the acceptable speed and throughput of rolling interaction.
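One plausible form for such criteria is a combined magnitude-and-velocity test; the paper derives its criteria empirically, so the 20-degree and 40-deg/s thresholds below are purely illustrative assumptions:

```python
# Hypothetical classifier for SLP-style wrist rolling: count a roll as
# intentional only if it is both large enough and fast enough.

def classify_roll(angles: list[float], timestamps: list[float],
                  min_angle: float = 20.0, min_velocity: float = 40.0) -> str:
    """angles: rotation about the emitting axis in degrees, one per frame."""
    if len(angles) < 2:
        return "incidental"
    total = abs(angles[-1] - angles[0])            # net rotation, degrees
    duration = timestamps[-1] - timestamps[0]      # seconds
    velocity = total / duration if duration > 0 else 0.0
    if total >= min_angle and velocity >= min_velocity:
        return "intentional"
    return "incidental"

# Example: a 45-degree roll over 0.5 s reads as intentional.
print(classify_roll([0, 15, 30, 45], [0.0, 0.17, 0.34, 0.5]))
```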


international conference on human computer interaction | 2011

uPlatform: a customizable multi-user windowing system for interactive tabletop

Chenjun Wu; Yue Suo; Chun Yu; Yuanchun Shi; Yongqiang Qin

Interactive tabletops have shown great potential for facilitating face-to-face collaboration in recent years. Yet, despite much promising research, one important area that remains largely unexplored is the tabletop windowing system, which would enable users to work with multiple independent or collaborative applications simultaneously. As a consequence, investigation of many scenarios, such as conferencing and planning, has been rather limited. To address this limitation, we present uPlatform, a multi-user windowing system created specifically for interactive tabletops. It is built on three components: 1) an input manager for processing concurrent multi-modal inputs; 2) a window manager for enforcing multi-user policies; 3) a hierarchical structure for organizing multi-task windows. All three components can be customized through a simple, flexible API. On top of uPlatform, three systems, uMeeting, uHome and uDining, have been implemented, demonstrating its effectiveness for building multi-user windowing systems on interactive tabletops.
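The customizable multi-user policy idea can be pictured as a hook the window manager consults before applying each input event. The API below is a sketch of what such a hook might look like, not uPlatform's actual interface:

```python
# Hypothetical policy hook in the spirit of uPlatform's customizable
# window manager: a policy decides whether a user may act on a window.
from typing import Callable

class Window:
    def __init__(self, title: str, owner: str):
        self.title, self.owner = title, owner

# A policy maps (user, window, action) -> allowed?
Policy = Callable[[str, Window, str], bool]

def owner_only(user: str, win: Window, action: str) -> bool:
    """Strict policy: only the owner may manipulate a window."""
    return user == win.owner

def free_for_all(user: str, win: Window, action: str) -> bool:
    """Collaborative policy: anyone at the table may act."""
    return True

class WindowManager:
    def __init__(self, policy: Policy):
        self.policy = policy
    def dispatch(self, user: str, win: Window, action: str) -> bool:
        if self.policy(user, win, action):
            print(f"{user}: {action} on '{win.title}'")
            return True
        return False

wm = WindowManager(owner_only)
doc = Window("agenda", owner="alice")
wm.dispatch("alice", doc, "move")   # allowed
wm.dispatch("bob", doc, "close")    # denied under owner_only
```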


user interface software and technology | 2011

PicoPet: "Real World" digital pet on a handheld projector

Yuhang Zhao; Chao Xue; Xiang Cao; Yuanchun Shi

We created PicoPet, a digital pet game based on mobile handheld projectors. The player can project the pet into physical environments, and the pet behaves and evolves differently according to its physical surroundings. PicoPet creates a new form of gaming experience that is directly blended into the physical world, and thus can become incorporated into players' daily lives and reflect their lifestyles. Pets projected by multiple players can also interact with each other, potentially triggering social interactions between the players. In this paper, we present the design and implementation of PicoPet, as well as directions for future exploration.
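One way to read "behaves differently according to its physical surroundings" is as a mapping from sensed context to behavior. The rule table below is purely illustrative; the paper does not publish its behavior model, and the context labels are invented:

```python
# Hypothetical surroundings-to-behavior mapping for a PicoPet-style pet.
BEHAVIOR_RULES = {
    "bright":    "play",
    "dark":      "sleep",
    "cluttered": "explore",
}

def pet_behavior(context: str) -> str:
    """Pick a behavior from the sensed surroundings; fall back to idling."""
    return BEHAVIOR_RULES.get(context, "idle")

print(pet_behavior("dark"))  # sleep
```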


human factors in computing systems | 2018

VirtualGrasp: Leveraging Experience of Interacting with Physical Objects to Facilitate Digital Object Retrieval

Yukang Yan; Chun Yu; Xiaojuan Ma; Xin Yi; Ke Sun; Yuanchun Shi

We propose VirtualGrasp, a novel gestural approach for retrieving virtual objects in virtual reality. Using VirtualGrasp, a user retrieves an object by performing a bare-handed gesture as if grasping its physical counterpart. The object-gesture mapping under this metaphor is highly intuitive, enabling users to easily discover and remember the gestures that retrieve each object. We conducted three user studies to demonstrate the feasibility and effectiveness of the approach, progressively investigating the consensus of the object-gesture mapping across users, the expressivity of grasping gestures, and the learnability and performance of the approach. Results showed that users achieved high agreement on the mapping, with an average agreement score [35] of 0.68 (SD=0.27). Without prior exposure to the gestures, users successfully retrieved 76% of objects with VirtualGrasp, and a week after learning the mapping they could recall the gestures for 93% of objects.
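The agreement score cited as [35] is most likely the standard gesture-elicitation measure (Wobbrock et al.): for each object, sum over groups of identical proposed gestures the squared fraction of participants in each group. Assuming that metric, a minimal computation looks like:

```python
# Agreement score for one referent, as commonly used in gesture elicitation.
from collections import Counter

def agreement_score(proposals: list[str]) -> float:
    """Sum over groups of identical proposals of
    (group size / total proposals) squared."""
    n = len(proposals)
    return sum((c / n) ** 2 for c in Counter(proposals).values())

# Example: 10 participants propose a gesture for "cup":
# 7 grasp, 2 pinch, 1 point -> 0.49 + 0.04 + 0.01 = 0.54
print(agreement_score(["grasp"] * 7 + ["pinch"] * 2 + ["point"]))
```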


ubiquitous computing | 2016

SkinMotion: what does skin movement tell us?

Yuntao Wang; Ke Sun; Lu Sun; Chun Yu; Yuanchun Shi

With the increasing popularity of wearable computing, emerging techniques allow novel interaction modalities to be transferred from portable devices to the human body itself. One promising approach is to appropriate the skin as an input interface. While prior research has explored the potential of the skin as an input surface, we present an alternative interaction modality: SkinMotion, which reconstructs human motions from skin-stretching movements. In this workshop paper, we discuss potential applications of SkinMotion and explore one specific instance: finger motion detection using skin movement on the dorsum of the hand. Results show that SkinMotion achieves an average estimation error of 5.84° for proximal phalanx flexion. We expect SkinMotion to open new possibilities for skin-based interaction and to push the boundaries of on-body technologies.
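Reconstructing joint angles from skin-stretch measurements is, at its simplest, a regression problem. The least-squares sketch below illustrates the idea on synthetic data; the feature choice, weights, and noise model are invented and are not the paper's actual method:

```python
# Hypothetical regression from skin-stretch features on the hand dorsum
# to proximal-phalanx flexion angle, in the spirit of SkinMotion.
import numpy as np

rng = np.random.default_rng(0)
# Features: displacement (mm) of two imaginary skin markers;
# target: flexion angle (degrees). All synthetic.
true_w = np.array([12.0, 7.5])
X = rng.uniform(0, 4, size=(200, 2))             # marker displacements
y = X @ true_w + rng.normal(0, 2.0, size=200)    # noisy flexion angles

w, *_ = np.linalg.lstsq(X, y, rcond=None)        # fit the linear map
pred = X @ w
print(f"mean abs error: {np.abs(pred - y).mean():.2f} deg")
```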


user interface software and technology | 2015

ATK: Enabling Ten-Finger Freehand Typing in Air Based on 3D Hand Tracking Data

Xin Yi; Chun Yu; Mingrui Zhang; Sida Gao; Ke Sun; Yuanchun Shi


ubiquitous computing | 2011

Proceedings of the 13th international conference on Ubiquitous computing

James Landay; Yuanchun Shi; Donald J. Patterson; Yvonne Rogers; Xing Xie


human factors in computing systems | 2016

One-Dimensional Handwriting: Inputting Letters and Words on Smart Glasses

Chun Yu; Ke Sun; Mingyuan Zhong; Xincheng Li; Peijun Zhao; Yuanchun Shi

Collaboration


Dive into Yuanchun Shi's collaborations.

Top Co-Authors

Ke Sun
Tsinghua University

Xin Yi
Tsinghua University