Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where David Rolnick is active.

Publication


Featured research published by David Rolnick.


Journal of Statistical Physics | 2017

Why Does Deep and Cheap Learning Work So Well?

Henry W. Lin; Max Tegmark; David Rolnick

We show how the success of deep learning could depend not only on mathematics but also on physics: although well-known mathematical theorems guarantee that neural networks can approximate arbitrary functions well, the class of functions of practical interest can frequently be approximated through “cheap learning” with exponentially fewer parameters than generic ones. We explore how properties frequently encountered in physics such as symmetry, locality, compositionality, and polynomial log-probability translate into exceptionally simple neural networks. We further argue that when the statistical process generating the data is of a certain hierarchical form prevalent in physics and machine learning, a deep neural network can be more efficient than a shallow one. We formalize these claims using information theory and discuss the relation to the renormalization group. We prove various “no-flattening theorems” showing when efficient linear deep networks cannot be accurately approximated by shallow ones without efficiency loss; for example, we show that n variables cannot be multiplied using fewer than 2^n neurons in a single hidden layer.
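The 2^n lower bound applies to a single hidden layer; with smooth nonlinearities, a product of two variables can be computed by just four neurons, which is the kind of "cheap" gadget at play here. A minimal sketch (the choice of softplus and of the scale parameter `a` are ours, not the paper's construction verbatim): since sigma(u) + sigma(-u) = 2*sigma(0) + sigma''(0)*u^2 + O(u^4), the even Taylor term isolates u^2, and (x+y)^2 - (x-y)^2 = 4xy recovers the product.

```python
import math

def softplus(x):
    return math.log1p(math.exp(x))

def approx_mul(x, y, a=1e-3):
    """Approximate x*y with four softplus 'neurons'.

    sigma(u) + sigma(-u) isolates the quadratic Taylor term sigma''(0)*u^2,
    and (x+y)^2 - (x-y)^2 = 4xy turns squares into a product.
    softplus''(0) = 0.25; the error shrinks as O(a^2).
    """
    s = (softplus(a * (x + y)) + softplus(-a * (x + y))
         - softplus(a * (x - y)) - softplus(-a * (x - y)))
    return s / (4 * a * a * 0.25)

print(approx_mul(3.0, -2.0))  # close to -6
```

Stacking this gadget in a tree of depth log2(n) multiplies n variables with O(n) neurons, whereas the theorem shows a single hidden layer needs 2^n.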


European Journal of Combinatorics | 2017

On the classification of Stanley sequences

David Rolnick


Discrete and Computational Geometry | 2017

Quantitative Tverberg Theorems Over Lattices and Other Discrete Sets

Jesús A. De Loera; Reuben N. La Haye; David Rolnick; Pablo Soberón


International Symposium on Symbolic and Algebraic Computation | 2015

Graph-Coloring Ideals: Nullstellensatz Certificates, Gröbner Bases for Chordal Graphs, and Hardness of Gröbner Bases

Jesús A. De Loera; Susan Margulies; Michael Pernpeintner; Eric Riedl; David Rolnick; Gwen Spencer; Despina Stasi; Jon Swenson


Discrete Mathematics | 2015

On the growth of Stanley sequences

David Rolnick; Praveen S. Venkataramana


Discrete Mathematics | 2016

Novel structures in Stanley sequences

Richard Moy; David Rolnick

An integer sequence is said to be 3-free if no three elements form an arithmetic progression. A Stanley sequence {a_n} is a 3-free sequence constructed by the greedy algorithm: given initial terms a_0 < a_1 < … < a_k, each subsequent term a_{n+1} is chosen to be the smallest integer greater than a_n such that the 3-free condition is not violated. Odlyzko and Stanley conjectured that Stanley sequences divide into two classes based on asymptotic growth: Type 1 sequences satisfy a_n = Θ(n^{log_2 3}) and appear well-structured, while Type 2 sequences satisfy a_n = Θ(n^2 / log n) and appear disorderly. In this paper, we define the notion of regularity, which is based on local structure and implies Type 1 asymptotic growth. We conjecture that the reverse implication holds. We construct many classes of regular Stanley sequences, which include all known Type 1 sequences as special cases. We show how two regular sequences may be combined into another regular sequence, and how parts of a Stanley sequence may be translated while preserving regularity. Finally, we demonstrate the surprising fact that certain Stanley sequences possess proper subsets that are also Stanley sequences.
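The greedy construction described in the abstract is simple to implement. A minimal sketch (function name and defaults are ours): a candidate c completes an arithmetic progression (a, b, c) exactly when some b already in the sequence has a = 2b - c also in the sequence.

```python
def stanley_sequence(initial=(0,), length=20):
    """Greedily extend a 3-free set: each new term is the smallest integer
    exceeding the last that creates no 3-term arithmetic progression."""
    seq = list(initial)
    have = set(seq)
    while len(seq) < length:
        c = seq[-1] + 1
        # c completes an AP (a, b, c) iff a = 2b - c is also already present
        while any((2 * b - c) in have for b in have):
            c += 1
        seq.append(c)
        have.add(c)
    return seq

print(stanley_sequence((0,), 10))  # [0, 1, 3, 4, 9, 10, 12, 13, 27, 28]
```

Starting from a_0 = 0, the greedy sequence consists of the integers whose base-3 expansion avoids the digit 2, a classic Type 1 (well-structured) example with a_n = Θ(n^{log_2 3}).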


Discrete and Computational Geometry | 2017

Quantitative Combinatorial Geometry for Continuous Parameters

Jesús A. De Loera; Reuben N. La Haye; David Rolnick; Pablo Soberón

This paper presents a new variation of Tverberg’s theorem. Given a discrete set S of \mathbb{R}^{d}, …
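The r = 2 case of Tverberg's theorem is Radon's theorem: any d + 2 points in R^d can be split into two groups whose convex hulls intersect. A brute-force sketch for the plane (names and the non-collinearity assumption are ours): an affine dependence among the four points yields the partition by the signs of its coefficients.

```python
def radon_point(p1, p2, p3, p4):
    """Radon partition of four points in the plane (Tverberg with r = 2):
    split them into two groups whose convex hulls share a point, returning
    (positive_group, negative_group, common_point).
    Assumes p1, p2, p3 are not collinear."""
    pts = [p1, p2, p3, p4]
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = pts

    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    # Affine dependence: sum(l_i) = 0 and sum(l_i * p_i) = (0, 0),
    # normalized with l_4 = 1; solve the 3x3 system by Cramer's rule.
    A = [[1, 1, 1], [x1, x2, x3], [y1, y2, y3]]
    b = [-1, -x4, -y4]
    D = det3(A)
    lam = []
    for col in range(3):
        Ac = [row[:] for row in A]
        for r in range(3):
            Ac[r][col] = b[r]
        lam.append(det3(Ac) / D)
    lam.append(1.0)

    pos = [i for i in range(4) if lam[i] > 0]
    neg = [i for i in range(4) if lam[i] <= 0]
    s = sum(lam[i] for i in pos)
    # The same point is a convex combination of each group, hence in both hulls.
    common = (sum(lam[i] * pts[i][0] for i in pos) / s,
              sum(lam[i] * pts[i][1] for i in pos) / s)
    return [pts[i] for i in pos], [pts[i] for i in neg], common

print(radon_point((0, 0), (4, 0), (0, 4), (1, 1)))
```

For these four points the partition is {(1, 1)} versus the triangle {(0, 0), (4, 0), (0, 4)}, meeting at (1, 1). The quantitative versions in the papers above constrain such intersection points to lie in a prescribed discrete set, such as the lattice Z^d.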


arXiv: Learning | 2018

Deep Learning is Robust to Massive Label Noise

David Rolnick; Andreas Veit; Serge J. Belongie; Nir Shavit


arXiv: Metric Geometry | 2015

Quantitative Tverberg, Helly, & Carathéodory theorems

Jesús A. De Loera; R. N. La Haye; David Rolnick; Pablo Soberón


Discrete Mathematics | 2017

Quantitative (p,q) theorems in combinatorial geometry

David Rolnick; Pablo Soberón

Collaboration


Dive into David Rolnick's collaborations.

Top Co-Authors

Pablo Soberón (University College London)

Despina Stasi (University of Illinois at Chicago)

Jon Swenson (University of Washington)

Nir Shavit (Massachusetts Institute of Technology)

Susan Margulies (United States Naval Academy)