Weiren Yu
Beihang University
Publications
Featured research published by Weiren Yu.
World Wide Web | 2012
Weiren Yu; Wenjie Zhang; Xuemin Lin; Qing Zhang; Jiajin Le
SimRank has become an important similarity measure to rank web documents based on a graph model on hyperlinks. The existing approaches for conducting SimRank computation adopt an iteration paradigm. The most efficient deterministic technique yields $O(n^3)$ worst-case time per iteration with the space requirement $O(n^2)$, where n is the number of nodes (web documents). In this paper, we propose novel optimization techniques such that each iteration takes $O(\min\{n \cdot m, n^r\})$ time and $O(n + m)$ space, where m is the number of edges in a web-graph model and $r \leq \log_2 7$. In addition, we extend the similarity transition matrix to prevent random surfers getting stuck, and devise a pruning technique to eliminate impractical similarities for each iteration. Moreover, we also develop a reordering technique combined with an over-relaxation method, not only speeding up the convergence rate of the existing techniques, but achieving I/O efficiency as well. We conduct extensive experiments on both synthetic and real data sets to demonstrate the efficiency and effectiveness of our iteration techniques.

IEEE Transactions on Multimedia | 2013
Jingjing Fu; Dan Miao; Weiren Yu; Shiqi Wang; Yan Lu; Shipeng Li
Unlike traditional RGB video, Kinect-like depth is characterized by its large variation range and instability. As a result, traditional video compression algorithms cannot be directly applied to Kinect-like depth compression with respect to coding efficiency. In this paper, we propose a lossy Kinect-like depth compression framework based on the existing codecs, aiming to enhance the coding efficiency while preserving the depth features for further applications. In the proposed framework, the Kinect-like depth is reformed first by divisive normalized bilateral filter (DNBL) to suppress the depth noises caused by disparity normalization, and then block-level depth padding is implemented for invalid depth region compensation in collaboration with mask coding to eliminate the sharp variation caused by depth measurement failures. Before the traditional video coding, the inter-frame correlation of reformed depth is explored by proposed 2D+T prediction, in which depth volume is developed to simulate 3D volume to generate pseudo 3D prediction reference for depth uniqueness detection. The unique depth region, called active region, is fed into the video encoder for traditional intra and inter prediction with residual coding, while the inactive region is skipped during depth coding. The experimental results demonstrate that our compression scheme can save 55%-85% in terms of bit cost and reduce coding complexity by 20%-65% in comparison with the traditional video compression algorithms. The visual quality of the 3D reconstruction is also improved after employing our compression scheme.

international conference on data engineering | 2014
Weiren Yu; Xuemin Lin; Wenjie Zhang

international conference on data mining | 2013
Weiren Yu; Charu C. Aggarwal; Shuai Ma; Haixun Wang

international acm sigir conference on research and development in information retrieval | 2013
Weiren Yu; Xuemin Lin

web age information management | 2010
Weiren Yu; Xuemin Lin; Jiajin Le

asia-pacific web conference | 2010
Weiren Yu; Xuemin Lin; Jiajin Le

Concurrency and Computation: Practice and Experience | 2016
Jianxin Li; Jianfeng Wen; Zhenying Tai; Richong Zhang; Weiren Yu

ieee acm international conference utility and cloud computing | 2014
Jianxin Li; Zhenying Tai; Richong Zhang; Weiren Yu; Lu Liu

IEEE Transactions on Big Data | 2017
Weiren Yu; Jianxin Li; Zakirul Alam Bhuiyan; Richong Zhang; Jinpeng Huai
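For context on the SimRank abstract above (World Wide Web, 2012): the "iteration paradigm" it optimizes is the standard fixed-point computation from Jeh and Widom's original SimRank formulation. A minimal sketch of that naive scheme follows — this illustrates the baseline the paper improves on, not the paper's own optimized algorithm; the toy graph, decay factor C, and iteration count are illustrative choices.

```python
# Naive SimRank iteration (Jeh & Widom's formulation): two nodes are similar
# if their in-neighbours are similar, with s(v, v) = 1. Each iteration sums
# over all in-neighbour pairs, giving the cubic per-iteration cost the
# abstract refers to.

def simrank(edges, nodes, C=0.8, iters=10):
    """edges: list of directed links (u, v); returns dict of pairwise scores."""
    in_nbrs = {v: [u for (u, w) in edges if w == v] for v in nodes}
    S = {(a, b): 1.0 if a == b else 0.0 for a in nodes for b in nodes}
    for _ in range(iters):
        S_new = {}
        for a in nodes:
            for b in nodes:
                if a == b:
                    S_new[(a, b)] = 1.0  # a node is maximally similar to itself
                elif in_nbrs[a] and in_nbrs[b]:
                    total = sum(S[(i, j)] for i in in_nbrs[a] for j in in_nbrs[b])
                    S_new[(a, b)] = C * total / (len(in_nbrs[a]) * len(in_nbrs[b]))
                else:
                    S_new[(a, b)] = 0.0  # no in-links on one side: score is 0
        S = S_new
    return S

# Two pages linked from the same hub page end up similar:
scores = simrank(edges=[("hub", "p1"), ("hub", "p2")],
                 nodes=["hub", "p1", "p2"])
print(round(scores[("p1", "p2")], 2))  # 0.8
```

On this toy graph the score of ("p1", "p2") converges to C after one iteration, since both pages share the single in-neighbour "hub" and s(hub, hub) = 1.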