
Publication


Featured research published by Tengyu Ma.


Symposium on the Theory of Computing | 2017

Finding approximate local minima faster than gradient descent

Naman Agarwal; Zeyuan Allen-Zhu; Brian Bullins; Elad Hazan; Tengyu Ma

We design a non-convex second-order optimization algorithm that is guaranteed to return an approximate local minimum in time which scales linearly in the underlying dimension and the number of training examples. The time complexity of our algorithm to find an approximate local minimum is even faster than that of gradient descent to find a critical point. Our algorithm applies to a general class of optimization problems including training a neural network and other non-convex objectives arising in machine learning.
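
A minimal illustrative sketch of the general idea behind such guarantees, not the algorithm from this paper: using only gradients and Hessian-vector products (the access model that keeps per-iteration cost linear in the dimension and the number of examples), alternate gradient steps with negative-curvature steps until neither makes progress. The toy objective, step sizes, and power-iteration subroutine below are illustrative choices, not taken from the paper.

```python
# Illustrative sketch only: escape saddle points with gradients and
# Hessian-vector products, returning a point that is an approximate local
# minimum (small gradient, no strongly negative curvature).
import numpy as np

def f(x):
    # Toy non-convex objective with a saddle point at the origin
    # and local minima at (0, +/- 1/sqrt(2)).
    return x[0]**2 - x[1]**2 + x[1]**4

def grad(x):
    return np.array([2*x[0], -2*x[1] + 4*x[1]**3])

def hvp(x, v, eps=1e-5):
    # Hessian-vector product via central differences of the gradient;
    # no explicit Hessian is ever formed.
    return (grad(x + eps*v) - grad(x - eps*v)) / (2*eps)

def most_negative_direction(x, beta=10.0, iters=200):
    # Power iteration on (beta*I - H) approximates the eigenvector of H
    # with the smallest eigenvalue, assuming ||H|| <= beta (true for this
    # toy f in the region visited below).
    v = np.random.randn(x.size)
    v /= np.linalg.norm(v)
    for _ in range(iters):
        v = beta*v - hvp(x, v)
        v /= np.linalg.norm(v)
    return v, v @ hvp(x, v)          # direction and its curvature v^T H v

def approx_local_min(x, eps_g=1e-3, eps_h=1e-3, step=0.1, max_iter=2000):
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) > eps_g:
            x = x - step*g           # ordinary gradient step
            continue
        v, curvature = most_negative_direction(x)
        if curvature < -eps_h:
            v = v if f(x + step*v) < f(x - step*v) else -v
            x = x + step*v           # negative-curvature step escapes the saddle
        else:
            return x                 # approximate local minimum reached
    return x

x = approx_local_min(np.array([0.0, 1e-6]))
print(x, f(x))                       # ends near (0, +/- 0.707), not the saddle
```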


Foundations of Computer Science | 2016

Polynomial-Time Tensor Decompositions with Sum-of-Squares

Tengyu Ma; Jonathan Shi; David Steurer

We give new algorithms based on the sum-of-squares method for tensor decomposition. Our results improve the best known running times from quasi-polynomial to polynomial for several problems, including decomposing random overcomplete 3-tensors and learning overcomplete dictionaries with constant relative sparsity. We also give the first robust analysis for decomposing overcomplete 4-tensors in the smoothed analysis model. A key ingredient of our analysis is to establish small spectral gaps in moment matrices derived from solutions to sum-of-squares relaxations. To enable this analysis we augment sum-of-squares relaxations with spectral analogs of maximum entropy constraints.
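
As a point of reference for the problem being solved, here is a minimal sketch (assuming NumPy and random Gaussian components; it only constructs the input and does not implement the paper's sum-of-squares algorithm) of a random overcomplete 3-tensor. Decomposition means recovering the hidden components a_1, ..., a_m from T alone, up to sign and permutation.

```python
import numpy as np

n, m = 50, 100                        # dimension n, number of components m > n
rng = np.random.default_rng(0)
A = rng.standard_normal((n, m))
A /= np.linalg.norm(A, axis=0)        # unit-norm components a_1, ..., a_m

# Overcomplete 3-tensor: T[i, j, k] = sum_l A[i, l] * A[j, l] * A[k, l].
T = np.einsum('il,jl,kl->ijk', A, A, A)

# Why simple unfolding-based methods stall here: the flattened matrix has rank
# at most n, so it cannot separate m > n components.
M = T.reshape(n, n * n)
print(T.shape, np.linalg.matrix_rank(M))   # (50, 50, 50) and a rank of at most 50
```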


International Workshop on Approximation, Randomization, and Combinatorial Optimization: Algorithms and Techniques (APPROX/RANDOM) | 2015

Decomposing Overcomplete 3rd Order Tensors using Sum-of-Squares Algorithms

Rong Ge; Tengyu Ma

Tensor rank and low-rank tensor decompositions have many applications in learning and complexity theory. Most known algorithms use unfoldings of tensors and can only handle rank up to $n^{\lfloor p/2 \rfloor}$ for a $p$-th order tensor in $\mathbb{R}^{n^p}$. Previously no efficient algorithm could decompose 3rd order tensors when the rank is super-linear in the dimension. Using ideas from the sum-of-squares hierarchy, we give the first quasi-polynomial time algorithm that can decompose a random 3rd order tensor when the rank is as large as $n^{3/2}/\textrm{polylog}\, n$.
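
For context, a standard observation (consistent with the abstract, stated here for completeness) explains where the $n^{\lfloor p/2 \rfloor}$ barrier comes from: each rank-one term remains rank one under the most balanced matrix unfolding, so the unfolding's rank, and hence the number of recoverable components, is capped by its smaller dimension.

```latex
% Flattening a p-th order tensor T in R^{n^p} by grouping ceil(p/2) modes
% against the remaining floor(p/2) modes sends each rank-one term to a
% rank-one matrix, so the number of components an unfolding can certify is
% bounded by the smaller matrix dimension; for p = 3 this is n, whereas the
% quasi-polynomial algorithm above reaches rank n^{3/2}/polylog n.
\[
T=\sum_{i=1}^{r} a_i^{(1)}\otimes\cdots\otimes a_i^{(p)}
\;\longmapsto\;
\mathrm{mat}(T)\in\mathbb{R}^{\,n^{\lceil p/2\rceil}\times n^{\lfloor p/2\rfloor}},
\qquad
\operatorname{rank}\bigl(\mathrm{mat}(T)\bigr)\le\min\bigl(r,\;n^{\lfloor p/2\rfloor}\bigr).
\]
```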


Symposium on the Theory of Computing | 2018

Generalization and equilibrium in generative adversarial nets (GANs) (invited talk)

Tengyu Ma



International Conference on Machine Learning | 2014

Provable Bounds for Learning Some Deep Representations

Sanjeev Arora; Aditya Bhaskara; Rong Ge; Tengyu Ma



Neural Information Processing Systems | 2016

Matrix Completion has No Spurious Local Minimum

Rong Ge; Jason D. Lee; Tengyu Ma



International Conference on Learning Representations | 2017

A Simple but Tough-to-Beat Baseline for Sentence Embeddings

Sanjeev Arora; Yingyu Liang; Tengyu Ma



International Conference on Machine Learning | 2017

Generalization and Equilibrium in Generative Adversarial Nets (GANs)

Sanjeev Arora; Rong Ge; Yingyu Liang; Tengyu Ma; Yi Zhang



International Conference on Learning Representations | 2017

Identity Matters in Deep Learning

Moritz Hardt; Tengyu Ma



Journal of Machine Learning Research | 2015

Simple, efficient, and neural algorithms for sparse coding

Sanjeev Arora; Rong Ge; Tengyu Ma; Ankur Moitra


Collaboration


Dive into Tengyu Ma's collaborations.

Top Co-Authors

Jason D. Lee

University of Southern California

Ankur Moitra

Massachusetts Institute of Technology