Christopher Musco
Massachusetts Institute of Technology
Publications
Featured research published by Christopher Musco.
conference on innovations in theoretical computer science | 2015
Michael B. Cohen; Yin Tat Lee; Cameron Musco; Christopher Musco; Richard Peng; Aaron Sidford
Random sampling has become a critical tool in solving massive matrix problems. For linear regression, a small, manageable set of data rows can be randomly selected to approximate a tall, skinny data matrix, improving processing time significantly. For theoretical performance guarantees, each row must be sampled with probability proportional to its statistical leverage score. Unfortunately, leverage scores are difficult to compute. A simple alternative is to sample rows uniformly at random. While this often works, uniform sampling will eliminate critical row information for many natural instances. We take a fresh look at uniform sampling by examining what information it does preserve. Specifically, we show that uniform sampling yields a matrix that, in some sense, well approximates a large fraction of the original. While this weak form of approximation is not enough for solving linear regression directly, it is enough to compute a better approximation. This observation leads to simple iterative row sampling algorithms for matrix approximation that run in input-sparsity time and preserve row structure and sparsity at all intermediate steps. In addition to an improved understanding of uniform sampling, our main proof introduces a structural result of independent interest: we show that every matrix can be made to have low coherence by reweighting a small subset of its rows.
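To make the leverage-score sampling idea in the abstract concrete, here is a minimal NumPy sketch of leverage-score row sampling for least-squares regression. It computes exact leverage scores via a QR factorization, which is precisely the expensive step the paper's iterative uniform-sampling scheme avoids; the problem sizes and sample count are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tall, skinny data matrix and targets for a least-squares problem.
n, d = 10_000, 20
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

# Exact statistical leverage scores: squared row norms of an orthonormal
# basis for A's column span. (The paper's contribution is approximating
# these cheaply via iterative uniform sampling, not computing them exactly.)
Q, _ = np.linalg.qr(A)
leverage = np.sum(Q**2, axis=1)          # sums to rank(A) = d

# Sample s rows with probability proportional to leverage, reweighting
# each kept row by 1/sqrt(s * p_i) so the sampled problem is unbiased.
s = 500
p = leverage / leverage.sum()
idx = rng.choice(n, size=s, p=p)
scale = 1.0 / np.sqrt(s * p[idx])
A_s = A[idx] * scale[:, None]
b_s = b[idx] * scale

# The small sampled problem approximates the full regression.
x_full, *_ = np.linalg.lstsq(A, b, rcond=None)
x_samp, *_ = np.linalg.lstsq(A_s, b_s, rcond=None)
```

Solving the 500-row sampled problem recovers a solution close to the full 10,000-row one, which is the performance guarantee leverage-score sampling provides.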
foundations of computer science | 2014
Michael Kapralov; Yin Tat Lee; Cameron Musco; Christopher Musco; Aaron Sidford
We present the first single pass algorithm for computing spectral sparsifiers of graphs in the dynamic semi-streaming model. Given a single pass over a stream containing insertions and deletions of edges to a graph G, our algorithm maintains a randomized linear sketch of the incidence matrix of dimension O((1/ε^2) n polylog(n)). Using this sketch, the algorithm can output a (1 ± ε) spectral sparsifier for G with high probability. While O((1/ε^2) n polylog(n)) space algorithms are known for computing cut sparsifiers in dynamic streams [AGM12b, GKP12] and spectral sparsifiers in insertion-only streams [KL11], prior to our work, the best known single pass algorithm for maintaining spectral sparsifiers in dynamic streams required sketches of dimension Ω((1/ε^2) n^(5/3)) [AGM14]. To achieve our result, we show that, using a coarse sparsifier of G and a linear sketch of G's incidence matrix, it is possible to sample edges by effective resistance, obtaining a spectral sparsifier of arbitrary precision. Sampling from the sketch requires a novel application of ℓ2/ℓ2 sparse recovery, a natural extension of the ℓ0 methods used for cut sparsifiers in [AGM12b]. Recent work of [MP12] on row sampling for matrix approximation gives a recursive approach for obtaining the required coarse sparsifiers. Under certain restrictions, our approach also extends to the problem of maintaining a spectral approximation for a general matrix A^T A given a stream of updates to rows in A.
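The core primitive in this abstract, sampling edges by effective resistance, can be illustrated offline with a small NumPy sketch. This is not the paper's streaming algorithm: it computes effective resistances directly from the Laplacian pseudoinverse (which requires the whole graph in memory), whereas the paper recovers such samples from a linear sketch. Graph size and sampling budget below are arbitrary toy choices.

```python
import numpy as np

rng = np.random.default_rng(1)

# Small random graph on n vertices, given as an edge list.
n = 30
edges = [(i, j) for i in range(n) for j in range(i + 1, n)
         if rng.random() < 0.3]
m = len(edges)

# Signed incidence matrix B (m x n); the Laplacian is L = B^T B.
B = np.zeros((m, n))
for k, (i, j) in enumerate(edges):
    B[k, i], B[k, j] = 1.0, -1.0
L = B.T @ B

# Effective resistance of edge (i, j): (e_i - e_j)^T L^+ (e_i - e_j).
Lpinv = np.linalg.pinv(L)
r_eff = np.array([Lpinv[i, i] + Lpinv[j, j] - 2 * Lpinv[i, j]
                  for i, j in edges])

# Sample edges proportionally to effective resistance, reweighting each
# kept edge by 1/(s * p) so the sparsifier's Laplacian is unbiased.
s = 4 * m // 5  # mild sparsification for this toy example
p = r_eff / r_eff.sum()
idx = rng.choice(m, size=s, p=p)
w = 1.0 / (s * p[idx])
L_sparse = sum(wk * np.outer(B[k], B[k]) for wk, k in zip(w, idx))
```

With enough samples, L_sparse spectrally approximates L; the paper's contribution is producing exactly this kind of sample from a single-pass linear sketch of B under edge insertions and deletions.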
symposium on discrete algorithms | 2017
Michael B. Cohen; Cameron Musco; Christopher Musco
We present a new algorithm for finding a near optimal low-rank approximation of a matrix A in O(nnz(A)) time. Our method is based on a recursive sampling scheme for computing a representative subset of A's columns, which is then used to find a low-rank approximation. This approach differs substantially from prior O(nnz(A)) time algorithms.
international world wide web conferences | 2018
Cameron Musco; Christopher Musco; Charalampos E. Tsourakakis
algorithm engineering and experimentation | 2017
Christopher Musco; Maxim Sviridenko; Justin Thaler
symposium on the theory of computing | 2015
Michael B. Cohen; Sam Elder; Cameron Musco; Christopher Musco; Madalina Persu
neural information processing systems | 2015
Cameron Musco; Christopher Musco
neural information processing systems | 2017
Cameron Musco; Christopher Musco
Archive | 2015
Cameron Musco; Christopher Musco
international conference on machine learning | 2016
Roy Frostig; Cameron Musco; Christopher Musco; Aaron Sidford
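The column-sampling route to low-rank approximation described in the symposium on discrete algorithms (2017) entry above can be illustrated with a dense NumPy toy sketch. Note the hedges: columns here are sampled by squared norm, a simple hypothetical stand-in for the ridge leverage scores the paper actually uses, and the matrix sizes, sample count, and target rank are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic matrix with a fast-decaying spectrum, so a good
# low-rank approximation exists.
n, d, k = 500, 200, 10
U, _ = np.linalg.qr(rng.standard_normal((n, d)))
V, _ = np.linalg.qr(rng.standard_normal((d, d)))
sigma = np.array([2.0 ** -i for i in range(d)])
A = U @ np.diag(sigma) @ V.T

# Sample c columns with probability proportional to squared column norm
# (a crude stand-in for the paper's ridge leverage score sampling).
c = 40
col_norms2 = np.sum(A**2, axis=0)
p = col_norms2 / col_norms2.sum()
cols = rng.choice(d, size=c, replace=False, p=p)

# Project A onto the span of the sampled columns, then truncate to rank k.
Q, _ = np.linalg.qr(A[:, cols])
P = Q @ (Q.T @ A)                        # projection of A onto span of samples
Uk, sk, Vk = np.linalg.svd(P, full_matrices=False)
A_k = Uk[:, :k] * sk[:k] @ Vk[:k]

# Compare against the optimal rank-k error from a full SVD.
err_sample = np.linalg.norm(A - A_k, "fro")
s_full = np.linalg.svd(A, compute_uv=False)
err_opt = np.sqrt(np.sum(s_full[k:] ** 2))
```

The point of the comparison is that a rank-k approximation built from a small column subset can come close to the optimal rank-k error, while only ever factoring a thin n x c matrix rather than all of A.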