Publication


Featured research published by Jason K. Johnson.


IEEE Transactions on Signal Processing | 2008

Estimation in Gaussian Graphical Models Using Tractable Subgraphs: A Walk-Sum Analysis

Venkat Chandrasekaran; Jason K. Johnson; Alan S. Willsky

Graphical models provide a powerful formalism for statistical signal processing. Due to their sophisticated modeling capabilities, they have found applications in a variety of fields such as computer vision, image processing, and distributed sensor networks. In this paper, we present a general class of algorithms for estimation in Gaussian graphical models with arbitrary structure. These algorithms involve a sequence of inference problems on tractable subgraphs over subsets of variables. This framework includes parallel iterations such as embedded trees, serial iterations such as block Gauss-Seidel, and hybrid versions of these iterations. We also discuss a method that uses local memory at each node to overcome temporary communication failures that may arise in distributed sensor network applications. We analyze these algorithms based on the recently developed walk-sum interpretation of Gaussian inference. We describe the walks “computed” by the algorithms using walk-sum diagrams, and show that for iterations based on a very large and flexible set of sequences of subgraphs, convergence is guaranteed in walk-summable models. Consequently, we are free to choose spanning trees and subsets of variables adaptively at each iteration. This leads to efficient methods for optimizing the next iteration step to achieve maximum reduction in error. Simulation results demonstrate that these nonstationary algorithms provide a significant speedup in convergence over traditional one-tree and two-tree iterations.
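The fixed-point structure of these subgraph iterations can be illustrated with a toy embedded-trees solver. The sketch below is my own construction, not code from the paper: it cuts one edge of a 4-cycle to leave a spanning tree and iterates the tree solve; the model is walk-summable, so the iteration converges to the exact means.

```python
import numpy as np

# Walk-summable Gaussian model on a 4-cycle: J = I - R with rho(|R|) < 1.
n, r = 4, 0.3
J = np.eye(n)
for i, j in [(0, 1), (1, 2), (2, 3), (3, 0)]:
    J[i, j] = J[j, i] = -r

h = np.array([1.0, 0.0, -1.0, 0.5])
exact = np.linalg.solve(J, h)           # exact means, for comparison only

# Embedded-tree splitting J = J_tree - K: cut edge (3, 0) to get a chain.
J_tree = J.copy()
J_tree[3, 0] = J_tree[0, 3] = 0.0
K = J_tree - J                           # nonzero only on the cut edge

x = np.zeros(n)
for _ in range(200):
    # In practice this tree solve costs O(n); a dense solve is shown for brevity.
    x = np.linalg.solve(J_tree, h + K @ x)
# x now agrees with the exact means to high accuracy
```

At a fixed point, J_tree x = h + K x, i.e. (J_tree - K) x = J x = h, so any convergent splitting of this form recovers the exact estimate.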


IEEE Transactions on Signal Processing | 2008

Low-Rank Variance Approximation in GMRF Models: Single and Multiscale Approaches

Dmitry M. Malioutov; Jason K. Johnson; Myung Jin Choi; Alan S. Willsky

We present a versatile framework for tractable computation of approximate variances in large-scale Gaussian Markov random field estimation problems. In addition to its efficiency and simplicity, it also provides accuracy guarantees. Our approach relies on the construction of a certain low-rank aliasing matrix with respect to the Markov graph of the model. We first construct this matrix for single-scale models with short-range correlations; we then introduce spliced wavelets and propose a construction for the long-range correlation case, as well as for multiscale models. We describe the accuracy guarantees that the approach provides and apply the method to a large interpolation problem from oceanography with sparse, irregular, and noisy measurements, and to a gravity inversion problem.
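The role of the aliasing matrix can be illustrated with a simplified probing scheme in the same spirit: nodes at least a separation length apart share a probe column with random signs, so the aliasing error involves only weak long-range correlations. This is a hedged sketch of the general idea, not the paper's spliced-wavelet construction.

```python
import numpy as np

# Chain GMRF: tridiagonal information matrix J; correlations decay with distance.
n, r = 64, 0.4
J = np.eye(n)
for i in range(n - 1):
    J[i, i + 1] = J[i + 1, i] = -r

exact_var = np.diag(np.linalg.inv(J))    # O(n^3) reference; infeasible at scale

# Low-rank probing: nodes i and j share a column only if |i - j| >= p,
# and each node carries a random sign, so aliasing terms tend to cancel.
rng = np.random.default_rng(0)
p = 8                                     # separation length
signs = rng.choice([-1.0, 1.0], size=n)
B = np.zeros((n, p))
B[np.arange(n), np.arange(n) % p] = signs

X = np.linalg.solve(J, B)                 # p solves instead of a full inverse
approx_var = signs * X[np.arange(n), np.arange(n) % p]

# Error at node i comes only from same-column nodes >= p steps away on the
# chain, whose correlation with node i is geometrically small.
rel_err = np.max(np.abs(approx_var - exact_var) / exact_var)
```

With rank p = 8 probes instead of n = 64 solves, the worst-case relative error here stays at the percent level because same-column correlations decay like 0.5^p.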


International Conference on Acoustics, Speech, and Signal Processing (ICASSP) | 2006

Low-Rank Variance Estimation in Large-Scale GMRF Models

Dmitry M. Malioutov; Jason K. Johnson; Alan S. Willsky

We consider the problem of variance estimation in large-scale Gauss-Markov random field (GMRF) models. While approximate mean estimates can be obtained efficiently for sparse GMRFs of very large size, computing the variances is a challenging problem. We propose a simple rank-reduced method which exploits the graph structure and the correlation length in the model to compute approximate variances with linear complexity in the number of nodes. The method has a separation-length parameter that trades off complexity versus estimation accuracy. For models with bounded correlation length, we efficiently compute provably accurate variance estimates.


IEEE Transactions on Image Processing | 2008

A Recursive Model-Reduction Method for Approximate Inference in Gaussian Markov Random Fields

Jason K. Johnson; Alan S. Willsky

This paper presents recursive cavity modeling - a principled, tractable approach to approximate, near-optimal inference for large Gauss-Markov random fields. The main idea is to subdivide the random field into smaller subfields, constructing cavity models which approximate these subfields. Each cavity model is a concise, yet faithful, model for the surface of one subfield sufficient for near-optimal inference in adjacent subfields. This basic idea leads to a tree-structured algorithm which recursively builds a hierarchy of cavity models during an “upward pass” and then builds a complementary set of blanket models during a reverse “downward pass.” The marginal statistics of individual variables can then be approximated using their blanket models. Model thinning plays an important role, allowing us to develop thinned cavity and blanket models thereby providing tractable approximate inference. We develop a maximum-entropy approach that exploits certain tractable representations of Fisher information on thin chordal graphs. Given the resulting set of thinned cavity models, we also develop a fast preconditioner, which provides a simple iterative method to compute optimal estimates. Thus, our overall approach combines recursive inference, variational learning and iterative estimation. We demonstrate the accuracy and scalability of this approach in several challenging, large-scale remote sensing problems.


IEEE/SP 14th Workshop on Statistical Signal Processing | 2007

Maximum Entropy Relaxation for Graphical Model Selection Given Inconsistent Statistics

Venkat Chandrasekaran; Jason K. Johnson; Alan S. Willsky

We develop a novel approach to approximate a specified collection of marginal distributions on subsets of variables by a globally consistent distribution on the entire collection of variables. In general, the specified marginal distributions may be inconsistent on overlapping subsets of variables. Our method is based on maximizing entropy over an exponential family of graphical models, subject to divergence constraints on small subsets of variables that enforce closeness to the specified marginals. The resulting optimization problem is convex, and can be solved efficiently using a primal-dual interior-point algorithm. Moreover, this framework leads naturally to a solution that is a sparse graphical model.
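In the special case where the specified marginals happen to be consistent, the max-entropy distribution matching them can be computed by classical iterative proportional fitting. The sketch below uses IPF on a tiny made-up example (illustrative numbers, not data from the paper) rather than the paper's primal-dual interior-point algorithm, which handles the harder inconsistent case via divergence constraints.

```python
import numpy as np

# Target marginals on the overlapping pairs (x1, x2) and (x2, x3). They are
# consistent here: both imply the same marginal for x2.
m12 = np.array([[0.3, 0.2], [0.1, 0.4]])
m23 = np.array([[0.25, 0.15], [0.2, 0.4]])
assert np.allclose(m12.sum(axis=0), m23.sum(axis=1))

# Start from the uniform joint and alternately rescale to match each marginal;
# the limit is the max-entropy joint, a Markov chain x1 - x2 - x3.
P = np.full((2, 2, 2), 1.0 / 8.0)
for _ in range(200):
    P *= (m12 / P.sum(axis=2))[:, :, None]   # enforce the (x1, x2) marginal
    P *= (m23 / P.sum(axis=0))[None, :, :]   # enforce the (x2, x3) marginal
```

Because the pairs form a tree, the scaling converges after a single cycle to the factorization m12(x1, x2) m23(x2, x3) / m2(x2), mirroring the paper's observation that the relaxation naturally yields a sparse graphical model.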


International Conference on Acoustics, Speech, and Signal Processing (ICASSP) | 2007

GMRF Variance Approximation Using Spliced Wavelet Bases

Dmitry M. Malioutov; Jason K. Johnson; Alan S. Willsky

We consider the problem of computing variances in large-scale Gauss-Markov random field (GMRF) models. In our prior work we considered the short-range correlation case, and we proposed a simple low-rank method which computes approximate variances with linear complexity in the number of nodes. In addition to its low complexity, the method has good guarantees on the quality of the approximation. In this paper we extend our method and analysis using a wavelet-based multi-scale approach which is applicable to models with much longer correlation lengths.


International Symposium on Information Theory (ISIT) | 2013

Improved linear programming decoding using frustrated cycles

Shrinivas Kudekar; Jason K. Johnson; Michael Chertkov

We consider data transmission over a binary-input additive white Gaussian noise channel using low-density parity-check codes. One of the most popular techniques for decoding low-density parity-check codes is the linear programming decoder. In general, the linear programming decoder is suboptimal. In this paper we present a systematic approach to enhance the linear programming decoder. More precisely, in the cases where the linear program outputs a fractional solution, we give a simple algorithm to identify the frustrated cycles which cause the output of the linear program to be fractional. Then, by adaptively adding these cycles to the basic linear program, we show improved word-error-rate performance.


Journal of Machine Learning Research | 2006

Walk-Sums and Belief Propagation in Gaussian Graphical Models

Dmitry M. Malioutov; Jason K. Johnson; Alan S. Willsky


arXiv: Artificial Intelligence | 2007

Lagrangian Relaxation for MAP Estimation in Graphical Models

Jason K. Johnson; Dmitry M. Malioutov; Alan S. Willsky


Neural Information Processing Systems | 2005

Walk-Sum Interpretation and Analysis of Gaussian Belief Propagation

Dmitry M. Malioutov; Alan S. Willsky; Jason K. Johnson

Collaboration


Dive into Jason K. Johnson's collaborations.

Top Co-Authors

Alan S. Willsky | Massachusetts Institute of Technology
Dmitry M. Malioutov | Massachusetts Institute of Technology
Venkat Chandrasekaran | California Institute of Technology
Michael Chertkov | Los Alamos National Laboratory
Myung Jin Choi | Massachusetts Institute of Technology
Diane Oyen | Los Alamos National Laboratory