Cynthia Rush
Columbia University
Publications
Featured research published by Cynthia Rush.
International Symposium on Information Theory | 2015
Cynthia Rush; Adam Greig; Ramji Venkataramanan
Sparse superposition codes were recently introduced by Barron and Joseph for reliable communication over the AWGN channel at rates approaching the channel capacity. In these codes, codewords are sparse linear combinations of the columns of a design matrix. In this paper, we propose an approximate message passing decoder for sparse superposition codes. The complexity of the decoder scales linearly with the size of the design matrix. The performance of the decoder is rigorously analyzed and it is shown to asymptotically achieve the AWGN capacity. We also provide simulation results to demonstrate the performance of the decoder at finite block lengths, and introduce a power allocation that significantly improves the empirical performance.
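The construction can be made concrete with a short sketch: the message selects one column per section of the design matrix, and the codeword is the corresponding sparse linear combination of columns. The section count L, section size M, code length n, and the flat power split below are illustrative assumptions, not the settings used in the paper.

import numpy as np

# Minimal sketch of sparse superposition (SPARC) encoding: one column is
# chosen per section of the design matrix, so the codeword is a sparse
# linear combination of its columns. All parameters are illustrative.
rng = np.random.default_rng(0)
L, M, n, P = 32, 64, 400, 1.0
A = rng.normal(scale=1/np.sqrt(n), size=(n, L * M))

msg = rng.integers(0, M, size=L)               # one column index per section
beta = np.zeros(L * M)
beta[np.arange(L) * M + msg] = np.sqrt(n * P / L)   # flat power allocation
codeword = A @ beta                            # average power roughly P per channel use
print("rate:", L * np.log2(M) / n, "bits per channel use")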
Chaos | 2010
Gregory W. Carter; Cynthia Rush; Filiz Uygun; Nikita A. Sakhanenko; David J. Galas; Timothy Galitski
Multiple high-throughput genetic interaction studies have provided substantial evidence of modularity in genetic interaction networks. However, the correspondence between these network modules and specific pathways of information flow is often ambiguous. Genetic interaction and molecular interaction analyses have not generated large-scale maps comprising multiple clearly delineated linear pathways. We seek to clarify the situation by discerning the difference between genetic modules and classical pathways. We review a method to optimize the discovery of biologically meaningful genetic modules based on a previously described context-dependent information measure to obtain maximally informative networks. We compare the results of this method with the established measures of network clustering and find that it balances global and local clustering information in networks. We further discuss the consequences for genetic interaction networks and propose a framework for the analysis of genetic modularity.
International Symposium on Information Theory | 2016
Cynthia Rush; Ramji Venkataramanan
This paper studies the performance of Approximate Message Passing (AMP) in the regime where the problem dimension is large but finite. We consider the setting of high-dimensional regression, where the goal is to estimate a high-dimensional vector β0 from an observation y = Aβ0 + w. AMP is a low-complexity, scalable algorithm for this problem. It has the attractive feature that its performance can be accurately characterized in the asymptotic large-system limit by a simple scalar iteration called state evolution. Previous proofs of the validity of state evolution have all been asymptotic convergence results. In this paper, we derive a concentration result for AMP with i.i.d. Gaussian measurement matrices of finite dimension n × N. The result shows that the probability of deviation from the state evolution prediction falls exponentially in n. Our result provides theoretical support for empirical findings that have demonstrated excellent agreement of AMP performance with state evolution predictions for moderately large dimensions.
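As a rough illustration of the algorithm analyzed here, the sketch below runs AMP on y = Aβ0 + w with an i.i.d. N(0, 1/n) Gaussian A and a soft-thresholding denoiser; the Bernoulli-Gaussian signal, the threshold choice, and the dimensions are assumptions made only for illustration. The quantity tau tracked from the residual is the empirical counterpart of the state-evolution parameter.

import numpy as np

# Minimal AMP sketch for y = A beta0 + w (illustrative parameters only).
rng = np.random.default_rng(0)
n, N, sigma = 500, 1000, 0.1
beta0 = rng.normal(size=N) * (rng.random(N) < 0.1)      # sparse signal (assumed prior)
A = rng.normal(scale=1/np.sqrt(n), size=(n, N))
y = A @ beta0 + sigma * rng.normal(size=n)

def soft(x, t):                                          # separable denoiser eta_t
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

beta, z = np.zeros(N), y.copy()
for _ in range(30):
    tau = np.sqrt(np.mean(z ** 2))                       # empirical state-evolution parameter
    beta_new = soft(beta + A.T @ z, tau)                 # denoise the effective observation
    z = y - A @ beta_new + (z / n) * np.count_nonzero(beta_new)   # Onsager correction
    beta = beta_new
print("per-entry MSE:", np.mean((beta - beta0) ** 2))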
Scientific Reports | 2016
Bruce E. Wexler; Markus Iseli; Seth Leon; William Zaggle; Cynthia Rush; Annette Goodman; A. Esat Imal; Emily Bo
Cognitive operations are supported by dynamically reconfiguring neural systems that integrate processing components widely distributed throughout the brain. The inter-neuronal connections that constitute these systems are powerfully shaped by environmental input. We evaluated the ability of computer-presented brain-training games done in school to harness this neuroplastic potential and improve learning in an overall study sample of 583 second-grade children. Doing a 5-minute brain-training game immediately before math or reading curricular content games increased performance on the curricular content games. Doing three 20-minute brain-training sessions per week for four months increased gains on school-administered math and reading achievement tests compared to control classes tested at the same times without intervening brain training. These results provide evidence of cognitive priming with immediate effects on learning, and longer-term brain training with far-transfer or generalized effects on academic achievement.
IEEE Transactions on Information Theory | 2017
Cynthia Rush; Adam Greig; Ramji Venkataramanan
Sparse superposition codes were recently introduced by Barron and Joseph for reliable communication over the additive white Gaussian noise (AWGN) channel at rates approaching the channel capacity. The codebook is defined in terms of a Gaussian design matrix, and codewords are sparse linear combinations of columns of the matrix. In this paper, we propose an approximate message passing decoder for sparse superposition codes, whose decoding complexity scales linearly with the size of the design matrix. The performance of the decoder is rigorously analyzed and it is shown to asymptotically achieve the AWGN capacity with an appropriate power allocation. Simulation results are provided to demonstrate the performance of the decoder at finite blocklengths. We introduce a power allocation scheme to improve the empirical performance, and demonstrate how the decoding complexity can be significantly reduced by using Hadamard design matrices.
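The heart of the decoder is a section-wise denoising step: given the effective observation s ≈ β + τZ, each section's entries are replaced by their posterior means under a uniform prior over the position of the section's nonzero. The sketch below shows one way this can look; the flat power allocation and the residual-based estimate of τ² are used purely for illustration (the paper's analysis relies on state evolution and treats power allocation in detail).

import numpy as np

# Sketch of AMP decoding for SPARCs with a section-wise softmax denoiser.
# Parameters, the flat power allocation, and the residual-based tau^2 are
# illustrative choices, not those analyzed in the paper.
rng = np.random.default_rng(1)
L, M, n = 32, 64, 400                          # rate R = L*log2(M)/n = 0.48 bits
P, snr = 1.0, 15.0
sigma2 = P / snr
c = np.sqrt(n * P / L) * np.ones(L)            # nonzero value per section (flat allocation)

A = rng.normal(scale=1/np.sqrt(n), size=(n, L * M))
msg = rng.integers(0, M, size=L)
beta0 = np.zeros(L * M)
beta0[np.arange(L) * M + msg] = c
y = A @ beta0 + np.sqrt(sigma2) * rng.normal(size=n)

def denoise(s, tau2):
    # Posterior mean of beta given s = beta + tau*Z and a uniform prior over
    # the location of the single nonzero in each section.
    est = np.zeros_like(s)
    for l in range(L):
        sec = s[l * M:(l + 1) * M] * c[l] / tau2
        w = np.exp(sec - sec.max())            # numerically stable softmax weights
        est[l * M:(l + 1) * M] = c[l] * w / w.sum()
    return est

beta, z = np.zeros(L * M), y.copy()
for _ in range(25):
    tau2 = np.mean(z ** 2)                     # empirical estimate of tau_t^2
    beta = denoise(beta + A.T @ z, tau2)
    z = y - A @ beta + (z / tau2) * (P - np.sum(beta ** 2) / n)   # Onsager term
decoded = np.array([np.argmax(beta[l * M:(l + 1) * M]) for l in range(L)])
print("section errors:", np.count_nonzero(decoded != msg), "of", L)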
International Symposium on Information Theory | 2017
Cynthia Rush; Ramji Venkataramanan
Sparse regression codes (SPARCs) are a recent class of codes for reliable communication over the AWGN channel at rates approaching the channel capacity. Approximate message passing (AMP) decoding, a computationally efficient technique for decoding SPARCs, has been proven to be asymptotically capacity-achieving for the AWGN channel. In this paper, we refine the asymptotic results by deriving a large deviations bound on the probability of AMP decoding error. This bound shows that for an appropriate choice of code parameters and any fixed rate smaller than the AWGN capacity, the probability of decoding error decays exponentially in n/(log n)^(2T), where T is the number of AMP iterations required for successful decoding. The number of iterations T is inversely proportional to the logarithm of the ratio of channel capacity to rate. For the above choice of code parameters, the complexity of the AMP decoder scales as a low-order polynomial in the block length n.
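Schematically, writing K_T and κ_T for constants that depend only on the number of iterations T (the precise constants are given in the paper), the bound described above has the shape

P(\text{decoding error}) \;\le\; K_T \exp\!\left(-\frac{\kappa_T\, n}{(\log n)^{2T}}\right), \qquad T = O\!\left(\frac{1}{\log(C/R)}\right),

where C is the AWGN capacity and R < C is the rate.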
International Symposium on Information Theory | 2017
Yanting Ma; Cynthia Rush; Dror Baron
Approximate message passing (AMP) is a class of efficient algorithms for solving high-dimensional linear regression tasks where one wishes to recover an unknown signal β0 from noisy, linear measurements y = Aβ0 + w. When applying a separable denoiser at each iteration, the performance of AMP (for example, the mean squared error of its estimates) can be accurately tracked by a simple, scalar iteration referred to as state evolution. Although separable denoisers are sufficient if the unknown signal has independent and identically distributed entries, in many real-world applications, like image or audio signal reconstruction, the unknown signal contains dependencies between entries. In these cases, a coordinate-wise independence structure is not a good approximation to the true prior of the unknown signal. In this paper we assume the unknown signal has dependent entries, and using a class of non-separable sliding-window denoisers, we prove that a new form of state evolution still accurately predicts AMP performance. This is an early step in understanding the role of non-separable denoisers within AMP, and will lead to a characterization of more general denoisers in problems including compressive image reconstruction.
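As a toy illustration of the non-separable idea, the denoiser below estimates each entry from a window of neighboring entries of the effective observation rather than from its own coordinate alone. The moving average is a deliberately simple stand-in for the class of sliding-window denoisers treated in the paper, and the divergence it returns is the quantity that replaces the usual coordinate-wise derivative sum in the AMP Onsager correction.

import numpy as np

def sliding_window_denoiser(s, k=2):
    # Non-separable denoiser: estimate entry i from the window s[i-k .. i+k].
    # A plain moving average is used as a toy stand-in for a Bayes-optimal
    # window denoiser.
    window = 2 * k + 1
    padded = np.pad(s, k, mode="edge")
    est = np.convolve(padded, np.ones(window) / window, mode="valid")
    # The Onsager correction needs sum_i d(eta_i)/d(s_i); each moving-average
    # output depends on its own coordinate with weight 1/window, so the
    # divergence is N/window (ignoring edge effects).
    divergence = s.size / window
    return est, divergence

Inside the AMP recursion, the residual update then uses z = y - A @ beta + (z / n) * divergence in place of the separable derivative sum.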
IEEE Transactions on Information Theory | 2018
Cynthia Rush; Ramji Venkataramanan
Asilomar Conference on Signals, Systems and Computers | 2017
Dror Baron; Anna Ma; Deanna Needell; Cynthia Rush; Tina Woolf
International Symposium on Information Theory | 2018
Kuan Hsieh; Cynthia Rush; Ramji Venkataramanan