Arthur Szlam
Publications
Featured research published by Arthur Szlam.
IEEE Signal Processing Magazine | 2017
Michael M. Bronstein; Joan Bruna; Yann LeCun; Arthur Szlam; Pierre Vandergheynst
Many scientific fields study data with an underlying structure that is non-Euclidean. Some examples include social networks in computational social sciences, sensor networks in communications, functional networks in brain imaging, regulatory networks in genetics, and meshed surfaces in computer graphics. In many applications, such geometric data are large and complex (in the case of social networks, on the scale of billions) and are natural targets for machine-learning techniques. In particular, we would like to use deep neural networks, which have recently proven to be powerful tools for a broad range of problems from computer vision, natural-language processing, and audio analysis. However, these tools have been most successful on data with an underlying Euclidean or grid-like structure and in cases where the invariances of these structures are built into networks used to model them.
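One widely used way to extend convolution to such non-Euclidean, graph-structured data is a spectral/propagation-style layer that mixes each node's features with its neighbors' via a normalized adjacency matrix. The sketch below is an illustrative Kipf-Welling-style graph convolution, not the specific architecture of the paper above; the function name and the ReLU choice are assumptions made for the example.

```python
import numpy as np

def gcn_propagate(adj, X, W):
    """One illustrative graph-convolution step:
    X' = relu(D^{-1/2} (A + I) D^{-1/2} X W)."""
    A_hat = adj + np.eye(adj.shape[0])          # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt    # symmetric normalization
    return np.maximum(A_norm @ X @ W, 0.0)      # neighborhood mixing + ReLU
```

Because `A_norm` encodes the graph's connectivity, the same weight matrix `W` is shared across all nodes, which is the graph analogue of a convolutional filter's translation weight-sharing on a grid.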
ACM Transactions on Mathematical Software | 2017
Huamin Li; George C. Linderman; Arthur Szlam; Kelly P. Stanton; Yuval Kluger; Mark Tygert
Recent years have witnessed intense development of randomized methods for low-rank approximation. These methods target principal component analysis and the calculation of truncated singular value decompositions. The present article presents an essentially black-box, foolproof implementation for MathWorks’ MATLAB, a popular software platform for numerical computation. As illustrated via several tests, the randomized algorithms for low-rank approximation outperform or at least match the classical deterministic techniques (such as Lanczos iterations run to convergence) in basically all respects: accuracy, computational efficiency (both speed and memory usage), ease-of-use, parallelizability, and reliability. However, the classical procedures remain the methods of choice for estimating spectral norms and are far superior for calculating the least singular values and corresponding singular vectors (or singular subspaces).
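The core idea behind such randomized low-rank methods can be sketched in a few lines of NumPy. This is a generic Halko-Martinsson-Tropp-style randomized SVD, not the authors' MATLAB implementation; the function name and the oversampling/power-iteration defaults are assumptions chosen for illustration.

```python
import numpy as np

def randomized_svd(A, k, oversample=10, n_iter=2, seed=0):
    """Sketch of a randomized truncated SVD of A, keeping k components."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    l = k + oversample
    # A random test matrix captures (with high probability) the range of A.
    Omega = rng.standard_normal((n, l))
    Y = A @ Omega
    # A few power iterations sharpen accuracy when singular values decay slowly.
    for _ in range(n_iter):
        Y = A @ (A.T @ Y)
    Q, _ = np.linalg.qr(Y)
    # Project A onto the captured subspace and take a small dense SVD.
    B = Q.T @ A
    Uh, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Uh)[:, :k], s[:k], Vt[:k, :]
```

The expensive operations are matrix-matrix products and a QR/SVD of small matrices, which is why these methods parallelize well and can beat Lanczos-type iterations in wall-clock time.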
Neural Information Processing Systems | 2015
Sainbayar Sukhbaatar; Arthur Szlam; Jason Weston; Rob Fergus
Neural Information Processing Systems | 2015
Emily L. Denton; Soumith Chintala; Arthur Szlam; Rob Fergus
Neural Information Processing Systems | 2016
Sainbayar Sukhbaatar; Arthur Szlam; Rob Fergus
International Conference on Learning Representations | 2016
Jesse Dodge; Andreea Gane; Xiang Zhang; Antoine Bordes; Sumit Chopra; Alexander H. Miller; Arthur Szlam; Jason Weston
International Conference on Learning Representations | 2017
Mikael Henaff; Jason Weston; Arthur Szlam; Antoine Bordes; Yann LeCun
arXiv | 2015
Sainbayar Sukhbaatar; Arthur Szlam; Jason Weston; Rob Fergus
International Conference on Learning Representations | 2018
Sainbayar Sukhbaatar; Zeming Lin; Ilya Kostrikov; Gabriel Synnaeve; Arthur Szlam; Rob Fergus
International Conference on Machine Learning | 2016
Mikael Henaff; Arthur Szlam; Yann LeCun