Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Jonathan H. Huggins is active.

Publication


Featured research published by Jonathan H. Huggins.


Journal of Computational and Graphical Statistics | 2014

Fast Kalman Filtering and Forward–Backward Smoothing via a Low-Rank Perturbative Approach

Eftychios A. Pnevmatikakis; Kamiar Rahnama Rad; Jonathan H. Huggins; Liam Paninski

Kalman filtering-smoothing is a fundamental tool in statistical time-series analysis. However, standard implementations of the Kalman filter-smoother require O(d^3) time and O(d^2) space per time step, where d is the dimension of the state variable, and are therefore impractical in high-dimensional problems. In this article we note that if a relatively small number of observations are available per time step, the Kalman equations may be approximated in terms of a low-rank perturbation of the prior state covariance matrix in the absence of any observations. In many cases this approximation may be computed and updated very efficiently (often in just O(k^2 d) or O(k^2 d + k d log d) time and space per time step, where k is the rank of the perturbation and in general k ≪ d), using fast methods from numerical linear algebra. We justify our approach and give bounds on the rank of the perturbation as a function of the desired accuracy. For the case of smoothing, we also quantify the error of our algorithm due to the low-rank approximation and show that it can be made arbitrarily low at the expense of a moderate computational cost. We describe applications involving smoothing of spatiotemporal neuroscience data. This article has online supplementary material.
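A minimal numpy sketch (not the authors' code) of the structure the abstract describes: with k observations per step, the Kalman measurement update changes the prior covariance by exactly a rank-k term, which is what the low-rank perturbative approach exploits. The prior covariance and observation matrix below are illustrative stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 200, 4  # state dimension d, observations per step k, with k << d

# Prior state covariance "in the absence of any observations"
# (a simple stationary covariance here; the paper's setting is more general).
idx = np.arange(d)
P = np.exp(-0.1 * np.abs(idx[:, None] - idx[None, :]))
H = rng.standard_normal((k, d))   # k measurements of the d-dim state
R = np.eye(k)                     # observation noise covariance

# Standard Kalman measurement update:
S = H @ P @ H.T + R                        # k x k innovation covariance
U = P @ H.T                                # d x k
P_post = P - U @ np.linalg.solve(S, U.T)   # prior minus a rank-k correction

# The perturbation P - P_post has rank exactly k:
rank = np.linalg.matrix_rank(P - P_post)

# Cross-check against the information-form (Woodbury) update:
P_info = np.linalg.inv(np.linalg.inv(P) + H.T @ np.linalg.inv(R) @ H)
```

Because the correction is rank k, applying the posterior covariance to a vector costs only O(k^2 d) plus the cost of applying the prior, which is the source of the speedups quoted in the abstract.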


Journal of Computational Neuroscience | 2014

Fast state-space methods for inferring dendritic synaptic connectivity

Ari Pakman; Jonathan H. Huggins; Carl Smith; Liam Paninski

We present fast methods for filtering voltage measurements and performing optimal inference of the location and strength of synaptic connections in large dendritic trees. Given noisy, subsampled voltage observations we develop fast ℓ1-penalized regression methods for Kalman state-space models of the neuron voltage dynamics. The value of the ℓ1-penalty parameter is chosen using cross-validation or, for low signal-to-noise ratio, a Mallows' Cp-like criterion. Using low-rank approximations, we reduce the inference runtime from cubic to linear in the number of dendritic compartments. We also present an alternative, fully Bayesian approach to the inference problem using a spike-and-slab prior. We illustrate our results with simulations on toy and real neuronal geometries. We consider observation schemes that either scan the dendritic geometry uniformly or measure linear combinations of voltages across several locations with random coefficients. For the latter, we show how to choose the coefficients to offset the correlation between successive measurements imposed by the neuron dynamics. This results in a "compressed sensing" observation scheme, with an important reduction in the number of measurements required to infer the synaptic weights.
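A small self-contained sketch, not the paper's code, of the ℓ1-penalized recovery idea in a static caricature of the problem: sparse synaptic weights inferred from fewer random-coefficient ("compressed sensing") measurements than compartments, solved with ISTA (proximal gradient). All sizes and the penalty value are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n_comp, n_obs = 400, 120   # dendritic compartments vs. (fewer) measurements

# Sparse ground truth: only a few compartments receive synaptic input.
w_true = np.zeros(n_comp)
active = rng.choice(n_comp, size=8, replace=False)
w_true[active] = rng.uniform(1.0, 2.0, size=8)

# Random-coefficient observation scheme: each measurement is a random
# linear combination of the compartment voltages, plus noise.
A = rng.standard_normal((n_obs, n_comp)) / np.sqrt(n_obs)
y = A @ w_true + 0.01 * rng.standard_normal(n_obs)

# ISTA: proximal gradient descent on 0.5*||y - A w||^2 + lam*||w||_1.
lam = 0.01
step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L for the quadratic term
w = np.zeros(n_comp)
for _ in range(1000):
    z = w - step * (A.T @ (A @ w - y))                        # gradient step
    w = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft-threshold

recovered = np.flatnonzero(np.abs(w) > 0.5)
```

The soft-threshold step is what drives most coefficients exactly to zero, so the support of the estimate identifies which compartments carry synapses even though the system is underdetermined.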


arXiv: Statistics Theory | 2015

Sequential Monte Carlo as Approximate Sampling: bounds, adaptive resampling via ∞-ESS, and an application to Particle Gibbs

Jonathan H. Huggins; Daniel M. Roy


neural information processing systems | 2016

Coresets for Scalable Bayesian Logistic Regression

Jonathan H. Huggins; Trevor Campbell; Tamara Broderick


international conference on artificial intelligence and statistics | 2017

Quantifying the accuracy of approximate diffusions and Markov chains

Jonathan H. Huggins; James Zou


arXiv: Statistics Theory | 2015

Convergence of Sequential Monte Carlo-based Sampling Methods

Jonathan H. Huggins; Daniel M. Roy


international conference on machine learning | 2015

JUMP-Means: Small-Variance Asymptotics for Markov Jump Processes

Jonathan H. Huggins; Karthik Narasimhan; Ardavan Saeedi; Vikash K. Mansinghka


international conference on machine learning | 2015

Risk and Regret of Hierarchical Bayesian Learners

Jonathan H. Huggins; Joshua B. Tenenbaum


arXiv: Statistics Theory | 2016

Truncated Random Measures

Trevor Campbell; Jonathan H. Huggins; Jonathan P. How; Tamara Broderick


arXiv: Machine Learning | 2016

Fast Robustness Quantification with Variational Bayes

Ryan Giordano; Tamara Broderick; Rachael Meager; Jonathan H. Huggins; Michael I. Jordan

Collaboration


Dive into Jonathan H. Huggins's collaborations.

Top Co-Authors

Trevor Campbell
Massachusetts Institute of Technology

Ardavan Saeedi
Massachusetts Institute of Technology