
Publication


Featured research published by Xinyang Yi.


IEEE Transactions on Information Theory | 2018

Convex and Nonconvex Formulations for Mixed Regression With Two Components: Minimax Optimal Rates

Yudong Chen; Xinyang Yi; Constantine Caramanis

We consider the mixed regression problem with two components, under adversarial and stochastic noise. We give a convex optimization formulation that provably recovers the true solution, as well as a nonconvex formulation that works under more general settings and remains tractable. Upper bounds are provided on the recovery errors for both arbitrary noise and stochastic noise models. We also give matching minimax lower bounds (up to log factors), showing that our algorithms are information-theoretically optimal in a precise sense. To the best of our knowledge, these are the first tractable algorithms guaranteeing successful recovery with tight bounds on recovery errors and sample complexity. Moreover, we pinpoint the statistical cost of mixtures: our minimax-optimal results indicate that the mixture poses a fundamentally harder problem in the low-SNR regime, where the achievable estimation rate changes.
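
To make the statistical model concrete, here is a minimal Python sketch (not the paper's convex or nonconvex formulation): each response is generated by one of two hidden regressors, and a simple alternating-minimization baseline recovers both up to the inherent label swap. The dimensions n and d, the noise level sigma, and the iteration count are illustrative assumptions.

# Minimal sketch of two-component mixed linear regression (illustrative
# parameters, not the paper's method): y_i = <x_i, beta_{z_i}> + noise
# with hidden labels z_i, recovered by alternating minimization.
import numpy as np

rng = np.random.default_rng(0)
n, d, sigma = 2000, 10, 0.1           # samples, dimension, noise level (assumed)

beta1 = rng.normal(size=d)            # hidden regressor 1
beta2 = rng.normal(size=d)            # hidden regressor 2
X = rng.normal(size=(n, d))
z = rng.integers(0, 2, size=n)        # latent component labels
y = np.where(z == 0, X @ beta1, X @ beta2) + sigma * rng.normal(size=n)

# Alternating minimization: assign each sample to the better-fitting
# regressor, then refit each regressor by least squares on its samples.
b1, b2 = rng.normal(size=d), rng.normal(size=d)    # random initialization
for _ in range(50):
    to_first = (y - X @ b1) ** 2 <= (y - X @ b2) ** 2
    if to_first.sum() >= d:
        b1, *_ = np.linalg.lstsq(X[to_first], y[to_first], rcond=None)
    if (~to_first).sum() >= d:
        b2, *_ = np.linalg.lstsq(X[~to_first], y[~to_first], rcond=None)

# Report error up to the label swap between the two components.
err = min(np.linalg.norm(b1 - beta1) + np.linalg.norm(b2 - beta2),
          np.linalg.norm(b1 - beta2) + np.linalg.norm(b2 - beta1))
print(f"recovery error (up to label swap): {err:.3f}")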


Design Automation Conference | 2015

Novel power grid reduction method based on L1 regularization

Ye Wang; Meng Li; Xinyang Yi; Zhao Song; Michael Orshansky; Constantine Caramanis

Model order reduction that exploits the spectral properties of the admittance matrix, known as the graph Laplacian, to control the approximation accuracy is a promising new class of approaches to power grid analysis. In this paper we introduce a method that allows a dramatic increase in the resulting graph sparsity and can handle large dense input graphs. The method is based on the observation that information about the realistic ranges of port currents can be used to significantly improve the resulting graph sparsity: in practice, port currents cannot vary unboundedly, and estimates of peak currents are often available early in the design cycle. However, existing methods, including the sampling-based spectral sparsification approach [11], cannot exploit this information.

We propose a novel framework of graph Sparsification by L1 regularization on Laplacians (SparseLL) that uses the available range information to achieve a higher degree of sparsity and better approximation quality. By formulating power grid reduction as a sparsity-inducing optimization problem, we leverage recent progress in stochastic approximation and develop a stochastic gradient descent algorithm as an efficient solution. On established benchmarks, we demonstrate that SparseLL achieves up to a 10X improvement in edge sparsity over the spectral sparsification approach assuming the full range of currents, together with up to a 10X improvement in accuracy. The running time of our algorithm also scales favorably owing to its low complexity and fast convergence, which leads us to believe that it is well suited to large-scale dense problems.
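
The core idea can be illustrated with a toy proxy in Python (a hedged sketch, not the authors' implementation): match the dense grid's Laplacian quadratic form on current vectors drawn from an assumed bounded port range, while an L1 penalty, applied through a stochastic proximal gradient loop, drives redundant edge weights to zero. The graph, port set, current range, and penalty weight lam are all illustrative assumptions.

# Toy sketch of the SparseLL idea (illustrative, not the paper's code):
# keep a reweighted edge subset whose Laplacian L(w) = B.T @ diag(w) @ B
# matches the dense grid's quadratic form i.T @ L @ i on currents that
# enter only at a few assumed port nodes within a bounded range; an L1
# prox step zeroes out edges the port currents never exercise.
import numpy as np

rng = np.random.default_rng(1)
n_nodes = 20
ports = np.arange(5)                       # assumed port nodes
edges = [(i, j) for i in range(n_nodes) for j in range(i + 1, n_nodes)]
m = len(edges)

B = np.zeros((m, n_nodes))                 # edge-node incidence matrix
for k, (i, j) in enumerate(edges):
    B[k, i], B[k, j] = 1.0, -1.0

w_true = rng.uniform(0.1, 1.0, size=m)     # dense "full grid" conductances
lam, step, iters = 0.5, 1e-3, 5000         # penalty, step size (assumed)
w = w_true.copy()                          # start from the dense grid

for _ in range(iters):
    i_vec = np.zeros(n_nodes)              # currents enter only at ports,
    i_vec[ports] = rng.uniform(-1.0, 1.0, size=ports.size)
    i_vec[ports] -= i_vec[ports].mean()    # and sum to zero (KCL)
    e = B @ i_vec                          # per-edge potential differences
    resid = (w - w_true) @ e**2            # i.T @ (L(w) - L) @ i mismatch
    w -= step * 2.0 * resid * e**2         # stochastic gradient of resid^2
    w = np.maximum(w - step * lam, 0.0)    # L1 prox step, weights stay >= 0

print(f"edges kept: {int((w > 1e-8).sum())} of {m}")

In this toy, edges never touched by port currents receive only the L1 shrinkage and vanish, which is the sketch's stand-in for the paper's observation that restricted current ranges permit much sparser reduced grids.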


Neural Information Processing Systems | 2016

Fast Algorithms for Robust PCA via Gradient Descent

Xinyang Yi; Dohyung Park; Yudong Chen; Constantine Caramanis


International Conference on Machine Learning | 2014

Alternating Minimization for Mixed Linear Regression

Xinyang Yi; Constantine Caramanis; Sujay Sanghavi


Conference on Learning Theory | 2014

A Convex Formulation for Mixed Regression with Two Components: Minimax Optimal Rates

Yudong Chen; Xinyang Yi; Constantine Caramanis


Neural Information Processing Systems | 2015

Optimal linear estimation under unknown nonlinear transform

Xinyang Yi; Zhaoran Wang; Constantine Caramanis; Han Liu


International Conference on Machine Learning | 2015

Binary Embedding: Fundamental Limits and Fast Algorithm

Xinyang Yi; Constantine Caramanis; Eric Price


arXiv: Learning | 2016

Solving a mixture of many random linear equations by tensor decomposition and alternating minimization

Xinyang Yi; Constantine Caramanis; Sujay Sanghavi


Neural Information Processing Systems | 2015

Regularized EM algorithms: a unified framework and statistical guarantees

Xinyang Yi; Constantine Caramanis


arXiv: Learning | 2015

Regularized EM Algorithms: A Unified Framework and Provable Statistical Guarantees

Xinyang Yi; Constantine Caramanis

Collaboration


Dive into Xinyang Yi's collaborations.

Top Co-Authors

Constantine Caramanis | University of Texas at Austin
Yudong Chen | University of California
Han Liu | Princeton University
Sujay Sanghavi | University of Texas at Austin
Dohyung Park | University of Texas at Austin
Eric Price | University of Texas at Austin
Meng Li | University of Texas at Austin
Michael Orshansky | University of Texas at Austin
Pradeep Ravikumar | Carnegie Mellon University