
Publication


Featured research published by Stefan Winkler.


IEEE Transactions on Multimedia | 2006

Perceived Audiovisual Quality of Low-Bitrate Multimedia Content

Stefan Winkler; Christof Faller

This paper studies the quality of multimedia content at very low bitrates. We carried out subjective experiments for assessing audiovisual, audio-only, and video-only quality. We selected content and encoding parameters that are typical of mobile applications. Our focus was on the MPEG-4 AVC (a.k.a. H.264) and AAC coding standards. Based on these data, we first analyze the influence of video and audio coding parameters on quality. We investigate the optimal trade-off between bits allocated to audio and to video under global bitrate constraints. Finally, we explore models for the interactions between audio and video in terms of perceived audiovisual quality.
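Audiovisual interaction models of this kind are often expressed as a bilinear combination of the single-modality scores. A minimal sketch of that general form, with purely illustrative coefficients (not the values fitted in the paper):

```python
def audiovisual_quality(q_audio, q_video, a=0.25, b=0.55, c=0.02):
    """Predict perceived audiovisual quality from audio-only and
    video-only quality scores (e.g. MOS on a 1-5 scale).

    Bilinear form: Q_av = a*Q_a + b*Q_v + c*Q_a*Q_v.
    The coefficients a, b, c here are hypothetical placeholders;
    real values would be fitted to subjective test data.
    """
    return a * q_audio + b * q_video + c * q_audio * q_video

# With these placeholder weights, video quality dominates the
# prediction, and the cross term rewards jointly high quality.
prediction = audiovisual_quality(3.0, 4.0)
```

The cross term Q_a*Q_v is what captures the interaction between modalities: the same audio improvement counts for more when the video is also good.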


International Conference on Human-Computer Interaction | 2007

Vision-based projected tabletop interface for finger interactions

Peng Song; Stefan Winkler; Syed Omer Gilani; ZhiYing Zhou

We designed and implemented a vision-based projected table-top interface for finger interaction. The system offers a simple and quick setup and economic design. The projection onto the tabletop provides more comfortable and direct viewing for users, and more natural, intuitive yet flexible interaction than classical or tangible interfaces. Homography calibration techniques are used to provide geometrically compensated projections on the tabletop. A robust finger tracking algorithm is proposed to enable accurate and efficient interactions using this interface. Two applications have been implemented based on this interface.
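Homography calibration boils down to mapping points between the camera and projector planes through a 3x3 matrix in homogeneous coordinates. A minimal sketch of applying such a mapping (the matrix below is an illustrative example, not a calibrated one):

```python
def apply_homography(H, point):
    """Map a 2-D point through a 3x3 homography H.

    H is a row-major 3x3 matrix (list of lists); the result is the
    projectively normalized image of `point`.
    """
    x, y = point
    xs = H[0][0] * x + H[0][1] * y + H[0][2]
    ys = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return (xs / w, ys / w)

# Illustrative homography: a pure translation by (10, 5) between
# tabletop and projector coordinates.
H_translate = [[1, 0, 10],
               [0, 1, 5],
               [0, 0, 1]]
print(apply_homography(H_translate, (2, 3)))  # -> (12.0, 8.0)
```

In a real projector-camera setup, H would be estimated from at least four point correspondences between the projected pattern and its observed image, which is what geometrically compensates the projection.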


Human Vision and Electronic Imaging Conference | 2008

Motion saliency outweighs other low-level features while watching videos

Dwarikanath Mahapatra; Stefan Winkler; Shih-Cheng Yen

The importance of motion in attracting attention is well known. While watching videos, where motion is prevalent, how do we quantify the regions that are motion salient? In this paper, we investigate the role of motion in attention and compare it with the influence of other low-level features like image orientation and intensity. We propose a framework for motion saliency. In particular, we integrate motion vector information with spatial and temporal coherency to generate a motion attention map. The results show that our model achieves good performance in identifying regions that are moving and salient. We also find motion to have greater influence on saliency than other low-level features when watching videos.
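The core idea of combining motion-vector magnitude with spatial coherency can be illustrated in a few lines: magnitude measures how much a region moves, and neighborhood smoothing rewards regions whose neighbors move consistently. A simplified toy version (not the paper's exact model, which also uses temporal coherency):

```python
def motion_attention_map(mvx, mvy):
    """Toy motion attention map: motion-vector magnitude smoothed by
    a 3x3 neighborhood average, so spatially coherent motion scores
    higher than isolated noisy vectors.  A simplified illustration of
    the idea, not the paper's full spatiotemporal model.
    """
    h, w = len(mvx), len(mvx[0])
    mag = [[(mvx[i][j] ** 2 + mvy[i][j] ** 2) ** 0.5 for j in range(w)]
           for i in range(h)]
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            vals = [mag[ii][jj]
                    for ii in range(max(0, i - 1), min(h, i + 2))
                    for jj in range(max(0, j - 1), min(w, j + 2))]
            out[i][j] = sum(vals) / len(vals)
    return out

# A 4x4 flow field where only the top-left 2x2 block moves.
mvx = [[2, 2, 0, 0], [2, 2, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]]
mvy = [[0, 0, 0, 0] for _ in range(4)]
attention = motion_attention_map(mvx, mvy)
```

The moving block receives high attention while the static background scores zero, which is the behavior the saliency comparison in the paper builds on.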


IEEE Transactions on Circuits and Systems for Video Technology | 2008

A Hybrid Framework for 3-D Human Motion Tracking

Bingbing Ni; Ashraf A. Kassim; Stefan Winkler

In this paper, we present a hybrid framework for articulated 3-D human motion tracking from multiple synchronized cameras with potential uses in surveillance systems. Although the recovery of 3-D motion provides richer information for event understanding, existing methods based on either deterministic search or stochastic sampling lack robustness or efficiency. We therefore propose a hybrid sample-and-refine framework that combines both stochastic sampling and deterministic optimization to achieve a good compromise between efficiency and robustness. Similar motion patterns are used to learn a compact low-dimensional representation of the motion statistics. Sampling in a low-dimensional space is implemented during tracking, which reduces the number of particles drastically. We also incorporate a local optimization method based on simulated physical force/moment into our framework, which further improves the optimality of the tracking. Experimental results on several real human motion sequences show the accuracy and robustness of our method, which also has a higher sampling efficiency than most particle filtering-based methods.
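The sample-and-refine idea, reduced to its skeleton, is: stochastic sampling finds a coarse candidate cheaply, then deterministic local search polishes it. A toy 1-D sketch of that control flow (the names, cost function, and fixed-step local search are illustrative stand-ins for the paper's particle sampling and force/moment optimization):

```python
import random

def sample_and_refine(cost, low, high, n_samples=50, n_refine=100,
                      step=0.01, seed=0):
    """Toy sample-and-refine loop on a 1-D pose parameter.

    Stage 1 (stochastic): draw uniform samples and keep the lowest-cost
    one.  Stage 2 (deterministic): hill-climb with a fixed step until no
    neighbor improves.  A stand-in for the paper's hybrid of particle
    sampling in a low-dimensional space plus physical force/moment
    optimization.
    """
    rng = random.Random(seed)
    best = min((rng.uniform(low, high) for _ in range(n_samples)), key=cost)
    for _ in range(n_refine):
        for cand in (best - step, best + step):
            if cost(cand) < cost(best):
                best = cand
    return best

# Quadratic cost with minimum at 0.3: sampling gets close, refinement
# settles to within one step of the optimum.
best = sample_and_refine(lambda x: (x - 0.3) ** 2, 0.0, 1.0)
```

The point of the hybrid is efficiency: sampling alone would need many more particles to land this close, while local search alone could get stuck far from the optimum without a good starting point.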


International Conference on Virtual Reality | 2007

User studies of a multiplayer first person shooting game with tangible and physical interaction

ZhiYing Zhou; Jefry Tedjokusumo; Stefan Winkler; Bingbing Ni

In this paper, we present a new immersive first-person shooting (FPS) game. Our system provides an intuitive way for the users to interact with the virtual world by physically moving around the real world and aiming freely with tangible objects. This encourages physical interaction between the players as they compete or collaborate with each other.


International Conference on Human-Computer Interaction | 2007

A tangible game interface using projector-camera systems

Peng Song; Stefan Winkler; Jefry Tedjokusumo

We designed and implemented a tangible game interface using projector-camera systems. The system offers a simple and quick setup and economic design. The projection onto a paper board held by the user provides more direct viewing as well as more natural and flexible interaction than bulky HMDs or monitor-based game interfaces. Homography calibration techniques are used to provide geometrically compensated projections on the board with robustness and accuracy.


Electronic Imaging | 2007

3D surveillance system using multiple cameras

Ajay K. Mishra; Bingbing Ni; Stefan Winkler; Ashraf A. Kassim

We propose a 3D surveillance system using multiple cameras surrounding the scene. Our application is concerned with identifying humans in the scene and then identifying their postures. Such information can help with automatic threat assessment of a scene. The cameras are fully calibrated and assumed to remain fixed in their positions. Object detection and interpretation are performed completely in 3D space. Using depth information, persons can easily be separated from the background and their posture identified by matching with 3D model templates.
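With fixed, fully calibrated cameras, depth-based separation of people from the background reduces to comparing the observed depth against a static background depth model. A toy sketch of that comparison (grid sizes, tolerance, and the simple per-pixel rule are illustrative, not the paper's pipeline):

```python
def segment_foreground(depth, background_depth, tol=0.2):
    """Toy depth-based segmentation: a pixel is foreground when its
    depth differs from the static background depth model by more than
    `tol` metres.  Illustrates depth-based person/background
    separation; the 0.2 m tolerance is a hypothetical value.
    """
    h, w = len(depth), len(depth[0])
    return [[abs(depth[i][j] - background_depth[i][j]) > tol
             for j in range(w)] for i in range(h)]

# Background wall at 2.0 m everywhere; a person occupies one pixel
# at 1.0 m.
background = [[2.0, 2.0], [2.0, 2.0]]
observed = [[2.0, 2.0], [1.0, 2.0]]
mask = segment_foreground(observed, background)
```

The resulting binary mask is the kind of 3D foreground evidence that posture matching against model templates would then operate on.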


Systems, Man and Cybernetics | 2010

Immersive Multiplayer Games With Tangible and Physical Interaction

Jefry Tedjokusumo; Steven Zhiying Zhou; Stefan Winkler

In this paper, we present a new immersive multiplayer game system developed for two different environments, namely, virtual reality (VR) and augmented reality (AR). To evaluate our system, we developed three game applications: a first-person-shooter game (for the VR and AR environments, respectively) and a sword game (for the AR environment). Our immersive system provides an intuitive way for users to interact with the VR or AR world by physically moving around the real world and aiming freely with tangible objects. This encourages physical interaction between players as they compete or collaborate with other players. Evaluation of our system consists of users' subjective opinions and their objective performance. Our design principles and evaluation results can be applied to similar immersive game applications based on AR/VR.


Electronic Imaging | 2007

Tangible mixed-reality desktop for digital media management

Stefan Winkler; Hang Yu; ZhiYing Zhou

This paper presents a tangible mixed reality desktop that supports gesture-oriented interactions in 3D space. The system is based on computer vision techniques for hand and finger detection, without the need for attaching any devices to the user. The system consists of a pair of stereo cameras that point to a planar surface as the workbench. Using stereo triangulation, the 3D locations and directions of the user's fingers are detected and tracked in the space on and above the surface. Based on our 3D finger tracking technique, we design a few simple multi-finger gestural interactions for digital media management. The system provides a convenient and user-friendly way of manipulating virtual objects in 3D space and supports seamless interactions with physical objects.
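For a rectified stereo pair, triangulation has a particularly simple closed form: depth is focal length times baseline over disparity. A sketch of that textbook relation (the system itself works with two calibrated views, a more general setup; the numbers below are made up for illustration):

```python
def triangulate_depth(focal_px, baseline_m, disparity_px):
    """Depth of a point from a rectified stereo pair: Z = f * B / d,
    with focal length f in pixels, baseline B in metres, and
    disparity d in pixels.  A textbook special case of stereo
    triangulation, shown here for illustration.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical rig: 800 px focal length, 10 cm baseline; a fingertip
# seen with 40 px disparity sits 2 m from the cameras.
print(triangulate_depth(800, 0.1, 40))  # -> 2.0
```

The inverse relation between disparity and depth is why nearby fingertips (large disparity) are localized much more precisely than distant ones.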


International Conference on Universal Access in Human-Computer Interaction | 2007

Intuitive map navigation on mobile devices

Stefan Winkler; Karthik Rangaswamy; ZhiYing Zhou

In this paper, we propose intuitive motion-based interfaces for map navigation on mobile devices with built-in cameras. The interfaces are based on the visual detection of the device's self-motion. This gives people the experience of navigating maps with a virtual looking glass. We conducted a user study to evaluate the accuracy, sensitivity and responsiveness of our proposed system. Results show that users appreciate our motion-based user interface and find it more intuitive than traditional key-based controls, even though there is a learning curve.
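A common way to turn camera self-motion into a navigation gesture is to estimate the dominant image translation from tracked feature points and pan the map in the opposite direction. A toy sketch of that idea (a simplification for illustration, not the paper's actual algorithm):

```python
def estimate_pan(prev_pts, curr_pts):
    """Toy camera self-motion estimate: the median displacement of
    tracked feature points approximates the dominant image
    translation; the map pans by its negation.  The median gives some
    robustness to a few mistracked points.  A simplification of
    camera-based motion sensing, not the paper's exact method.
    """
    def median(vals):
        s = sorted(vals)
        n = len(s)
        return s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2
    dx = median([c[0] - p[0] for p, c in zip(prev_pts, curr_pts)])
    dy = median([c[1] - p[1] for p, c in zip(prev_pts, curr_pts)])
    return (-dx, -dy)  # map pans opposite to the camera's motion

# Camera moved right and up: all features shift by (3, -2) in the
# image, so the map should pan by (-3, 2).
pan = estimate_pan([(0, 0), (10, 0), (5, 5)], [(3, -2), (13, -2), (8, 3)])
```

Negating the motion is what produces the "looking glass" feel described above: moving the device right slides the viewport right across the map.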

Collaboration


Dive into Stefan Winkler's collaborations.

Top Co-Authors

Bingbing Ni, Shanghai Jiao Tong University
ZhiYing Zhou, National University of Singapore
Ashraf A. Kassim, National University of Singapore
Jefry Tedjokusumo, National University of Singapore
Karthik Rangaswamy, National University of Singapore
Peng Song, National University of Singapore
Steven Zhiying Zhou, National University of Singapore
Syed Omer Gilani, National University of Singapore
Christof Faller, École Polytechnique Fédérale de Lausanne
Hang Yu, National University of Singapore