
Publication


Featured research published by Ruei-Sung Lin.


International Journal of Computer Vision | 2008

Incremental Learning for Robust Visual Tracking

David A. Ross; Jongwoo Lim; Ruei-Sung Lin; Ming-Hsuan Yang

Visual tracking, in essence, deals with non-stationary image streams that change over time. While most existing algorithms are able to track objects well in controlled environments, they usually fail in the presence of significant variation of the object’s appearance or surrounding illumination. One reason for such failures is that many algorithms employ fixed appearance models of the target. Such models are trained using only appearance data available before tracking begins, which in practice limits the range of appearances that are modeled, and ignores the large volume of information (such as shape changes or specific lighting conditions) that becomes available during tracking. In this paper, we present a tracking method that incrementally learns a low-dimensional subspace representation, efficiently adapting online to changes in the appearance of the target. The model update, based on incremental algorithms for principal component analysis, includes two important features: a method for correctly updating the sample mean, and a forgetting factor to ensure less modeling power is expended fitting older observations. Both of these features contribute measurably to improving overall tracking performance. Numerous experiments demonstrate the effectiveness of the proposed tracking algorithm in indoor and outdoor environments where the target objects undergo large changes in pose, scale, and illumination.
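
The update step described above, an incremental principal component analysis with an explicit correction of the sample mean and a forgetting factor, can be illustrated with a short NumPy sketch. This is a simplified illustration under assumed interfaces: the function name, the batch layout, and the direct SVD of the concatenated matrix are choices made here for brevity, not the paper's reference implementation, which uses a more economical factorized update.

import numpy as np

def incremental_subspace_update(U, S, mu, n, B, k=16, forget=0.95):
    """Simplified sketch: fold a new batch B (one observation per column)
    into a low-dimensional subspace (U, S) with mean mu, seen over n
    effective observations. Hypothetical helper, not the authors' code."""
    d, m = B.shape
    mu_B = B.mean(axis=1)
    if U is None:                      # first batch: plain SVD
        mu_new, C = mu_B, B - mu_B[:, None]
    else:
        n_eff = forget * n             # forgetting factor down-weights the past
        mu_new = (n_eff * mu + m * mu_B) / (n_eff + m)
        # extra column corrects the subspace for the shift of the sample mean
        mean_corr = np.sqrt(n_eff * m / (n_eff + m)) * (mu - mu_B)
        C = np.hstack([forget * U * S,         # scaled old subspace
                       B - mu_B[:, None],      # centered new observations
                       mean_corr[:, None]])
    U_new, S_new, _ = np.linalg.svd(C, full_matrices=False)
    return U_new[:, :k], S_new[:k], mu_new, forget * n + m

Setting forget below 1 shrinks both the retained basis and the effective observation count, which is what keeps the appearance model from over-committing to observations made early in the sequence.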


International Conference on Computer Vision | 2011

The power of comparative reasoning

Jay Yagnik; Dennis Strelow; David A. Ross; Ruei-Sung Lin

Rank correlation measures are known for their resilience to perturbations in numeric values and are widely used in many evaluation metrics. Such ordinal measures have rarely been applied in the treatment of numeric features as a representational transformation. We emphasize the benefits of ordinal representations of input features both theoretically and empirically. We present a family of algorithms for computing ordinal embeddings based on partial order statistics. Apart from having the stability benefits of ordinal measures, these embeddings are highly nonlinear, giving rise to sparse feature spaces highly favored by several machine learning methods. These embeddings are deterministic, data independent and, by virtue of being based on partial order statistics, add another degree of resilience to noise. These machine-learning-free methods, when applied to the task of fast similarity search, outperform state-of-the-art machine learning methods with complex optimization setups. For solving classification problems, the embeddings provide a nonlinear transformation resulting in sparse binary codes that are well-suited for a large class of machine learning algorithms. These methods show significant improvement on VOC 2010 using simple linear classifiers which can be trained quickly. Our method can be extended to the case of polynomial kernels, while permitting very efficient computation. Further, since the popular MinHash algorithm is a special case of our method, we demonstrate an efficient scheme for computing MinHash on conjunctions of binary features. The actual method can be implemented in about 10 lines of code in most languages (2 lines in MATLAB), and does not require any data-driven optimization.
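
As a rough illustration of the partial-order embeddings described above (and of the claimed ten-line implementation), below is a hedged winner-take-all style sketch in NumPy. The function name, the window size K, and the fixed random permutations are assumptions made for the example, not an exact transcription of the paper's procedure.

import numpy as np

def ordinal_codes(x, permutations, K=4):
    # For each stored permutation, look at the first K permuted coordinates
    # of x and emit the index of the largest one (a partial order statistic).
    x = np.asarray(x)
    return np.array([int(np.argmax(x[perm[:K]])) for perm in permutations])

# Example: 128-dimensional features, 64 codes with window size 4.
rng = np.random.default_rng(0)
perms = [rng.permutation(128) for _ in range(64)]
a = rng.standard_normal(128)
b = a + 0.05 * rng.standard_normal(128)      # small numeric perturbation
# Most code positions agree despite the perturbation, reflecting the
# rank-correlation stability the abstract emphasizes.
print(np.mean(ordinal_codes(a, perms) == ordinal_codes(b, perms)))

Because the permutations are drawn once and then fixed, the embedding is deterministic and data independent, and the resulting sparse codes can be fed directly to similarity search or linear classifiers.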


Computer Vision and Pattern Recognition | 2010

SPEC hashing: Similarity preserving algorithm for entropy-based coding

Ruei-Sung Lin; David A. Ross; Jay Yagnik

Searching for approximate nearest neighbors in large-scale, high-dimensional data sets has been a challenging problem. This paper presents a novel and fast algorithm for learning binary hash functions for fast nearest neighbor retrieval. The nearest neighbors are defined according to the semantic similarity between the objects. Our method uses the information in these semantic similarities and learns a hash function with binary codes such that only objects with high similarity have small Hamming distance. The hash function is incrementally trained one bit at a time, and as bits are added to the hash code the Hamming distances between dissimilar objects increase. We further link our method to the idea of maximizing conditional entropy among pairs of bits and derive an extremely efficient linear-time hash learning algorithm. Experiments on similar-image retrieval and celebrity face recognition show that our method produces a clear improvement in performance over several state-of-the-art methods.
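
To make the one-bit-at-a-time training concrete, here is a toy sketch that greedily selects bits from a pool of candidate binary functions already evaluated on the training set, preferring bits that keep semantically similar pairs in agreement and that separate dissimilar pairs the code so far still confuses. The scoring rule is a simplified stand-in for the conditional-entropy objective mentioned above, and every name and interface is an illustrative assumption.

import numpy as np

def greedy_bit_selection(candidates, sim_pairs, dis_pairs, num_bits=32):
    # candidates : (C, N) 0/1 matrix, candidate bit c evaluated on item n
    # sim_pairs / dis_pairs : lists of (i, j) index pairs of items
    C, N = candidates.shape
    chosen, used = [], np.zeros(C, dtype=bool)
    si, sj = np.array(sim_pairs).T
    di, dj = np.array(dis_pairs).T
    # how often each candidate bit agrees on similar pairs (static score)
    agree_sim = (candidates[:, si] == candidates[:, sj]).mean(axis=1)
    for _ in range(min(num_bits, C)):
        if chosen:
            code = candidates[chosen]                       # bits picked so far
            ham = (code[:, di] != code[:, dj]).sum(axis=0)  # current separation
            w_dis = 1.0 / (1.0 + ham)        # focus on still-colliding pairs
        else:
            w_dis = np.ones(len(di))
        sep_dis = ((candidates[:, di] != candidates[:, dj]) * w_dis).mean(axis=1)
        score = np.where(used, -np.inf, agree_sim + sep_dis)
        best = int(np.argmax(score))
        used[best] = True
        chosen.append(best)
    return chosen   # indices of the selected hash bits, in learned order

Each round rewards a candidate for splitting dissimilar pairs that the current code still maps close together in Hamming space, which mirrors the abstract's point that distances between dissimilar objects grow as bits are added.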


Archive | 2013

Method to Predict a Communicative Action that is Most Likely to be Executed Given a Context

Anna Lynn Patterson; Hrishikesh Aradhye; Wei Hua; Daniel Lehmann; Ruei-Sung Lin


Archive | 2012

Native machine learning service for user adaptation on a mobile platform

Hrishikesh Aradhye; Wei Hua; Ruei-Sung Lin


Archive | 2014

Systems and methods for prioritizing notifications on mobile devices

Hrishikesh Aradhye; Wei Hua; Ruei-Sung Lin; Mohammad Saberian


Archive | 2010

Method and system for entropy-based semantic hashing

Ruei-Sung Lin; David A. Ross; Jay Yagnik


Archive | 2014

Video content claiming classifier

Clifford Samaniego; David G. King; David A. Ross; Alexander Joshua Frank; Omid Madani; Kenji Arai; Ruei-Sung Lin


Archive | 2012

Automatic sequencing of video playlists based on mood classification of each video and video cluster transitions

Sanketh Shetty; Ruei-Sung Lin; David A. Ross; Hrishikesh Aradhye


Archive | 2004

Adaptive probabilistic visual tracking with incremental subspace update

Ming-Hsuan Yang; Jongwoo Lim; David Ross; Ruei-Sung Lin

