Tengyu Ma
Princeton University
Publications
Featured research published by Tengyu Ma.
symposium on the theory of computing | 2017
Naman Agarwal; Zeyuan Allen-Zhu; Brian Bullins; Elad Hazan; Tengyu Ma
We design a non-convex second-order optimization algorithm that is guaranteed to return an approximate local minimum in time which scales linearly in the underlying dimension and the number of training examples. The time complexity of our algorithm to find an approximate local minimum is even faster than that of gradient descent to find a critical point. Our algorithm applies to a general class of optimization problems including training a neural network and other non-convex objectives arising in machine learning.
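The guarantee in this abstract is stated in terms of approximate local minima. A minimal numerical sketch (assuming NumPy; this is the standard definition, not the paper's algorithm) of what that condition means — gradient norm at most eps and smallest Hessian eigenvalue at least -delta — for a toy non-convex function:

```python
import numpy as np

# Toy non-convex function f(x, y) = x^4 - 2x^2 + y^2.
# Its local minima are at (+-1, 0); the origin is a saddle point.

def grad(v):
    x, y = v
    return np.array([4 * x**3 - 4 * x, 2 * y])

def hessian(v):
    x, y = v
    return np.array([[12 * x**2 - 4, 0.0],
                     [0.0,           2.0]])

def is_approx_local_min(v, eps=1e-3, delta=1e-3):
    # ||grad f(v)|| <= eps  and  lambda_min(Hessian f(v)) >= -delta
    small_grad = np.linalg.norm(grad(v)) <= eps
    almost_psd = np.linalg.eigvalsh(hessian(v))[0] >= -delta
    return small_grad and almost_psd

print(is_approx_local_min(np.array([1.0, 0.0])))  # True: a local minimum
print(is_approx_local_min(np.array([0.0, 0.0])))  # False: saddle, Hessian eigenvalue -4
```

The point of the second check is exactly the distinction the paper exploits: a critical point (zero gradient) need not be a local minimum, and certifying the Hessian condition is what makes the problem harder than finding a critical point.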
foundations of computer science | 2016
Tengyu Ma; Jonathan Shi; David Steurer
We give new algorithms based on the sum-of-squares method for tensor decomposition. Our results improve the best known running times from quasi-polynomial to polynomial for several problems, including decomposing random overcomplete 3-tensors and learning overcomplete dictionaries with constant relative sparsity. We also give the first robust analysis for decomposing overcomplete 4-tensors in the smoothed analysis model. A key ingredient of our analysis is to establish small spectral gaps in moment matrices derived from solutions to sum-of-squares relaxations. To enable this analysis we augment sum-of-squares relaxations with spectral analogs of maximum entropy constraints.
international workshop on approximation, randomization, and combinatorial optimization: algorithms and techniques | 2015
Rong Ge; Tengyu Ma
Tensor rank and low-rank tensor decompositions have many applications in learning and complexity theory. Most known algorithms use unfoldings of tensors and can only handle rank up to n^{\lfloor p/2 \rfloor} for a p-th order tensor in \mathbb{R}^{n^p}. Previously no efficient algorithm could decompose 3rd order tensors when the rank is super-linear in the dimension. Using ideas from the sum-of-squares hierarchy, we give the first quasi-polynomial time algorithm that can decompose a random 3rd order tensor when the rank is as large as n^{3/2}/\textrm{polylog}\, n.

symposium on the theory of computing | 2018
Tengyu Ma

international conference on machine learning | 2014
Sanjeev Arora; Aditya Bhaskara; Rong Ge; Tengyu Ma

neural information processing systems | 2016
Rong Ge; Jason D. Lee; Tengyu Ma

international conference on learning representations | 2017
Sanjeev Arora; Yingyu Liang; Tengyu Ma

international conference on machine learning | 2017
Sanjeev Arora; Rong Ge; Yingyu Liang; Tengyu Ma; Yi Zhang

international conference on learning representations | 2017
Moritz Hardt; Tengyu Ma

Journal of Machine Learning Research | 2015
Sanjeev Arora; Rong Ge; Tengyu Ma; Ankur Moitra
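A minimal sketch (assuming NumPy; illustrative only, not from any of the papers above) of the tensor "unfolding" the Ge–Ma abstract refers to: a p-th order tensor is flattened into a matrix so that matrix-rank tools apply. For a rank-1 3rd order tensor a ⊗ b ⊗ c, the mode-1 unfolding is the rank-1 matrix a (b ⊗ c)^T:

```python
import numpy as np

np.random.seed(0)
n = 4

# Build a rank-1 3rd order tensor T[i, j, k] = a[i] * b[j] * c[k].
a, b, c = (np.random.randn(n) for _ in range(3))
T = np.einsum('i,j,k->ijk', a, b, c)

# Mode-1 unfolding: reshape the n x n x n tensor into an n x n^2 matrix.
unfolding = T.reshape(n, n * n)

print(np.linalg.matrix_rank(unfolding))  # 1
```

Since the unfolding of an n x n x n tensor is only n x n^2, its matrix rank is capped at n; this is the structural reason unfolding-based methods stall at roughly linear rank, which is the barrier the abstract's sum-of-squares algorithm pushes past.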