Publication


Featured research published by Bingni W. Brunton.


PLOS ONE | 2016

Koopman Invariant Subspaces and Finite Linear Representations of Nonlinear Dynamical Systems for Control.

Steven L. Brunton; Bingni W. Brunton; Joshua L. Proctor; J. Nathan Kutz

In this work, we explore finite-dimensional linear representations of nonlinear dynamical systems by restricting the Koopman operator to an invariant subspace spanned by specially chosen observable functions. The Koopman operator is an infinite-dimensional linear operator that evolves functions of the state of a dynamical system. Dominant terms in the Koopman expansion are typically computed using dynamic mode decomposition (DMD). DMD uses linear measurements of the state variables, and it has recently been shown that this may be too restrictive for nonlinear systems. Choosing the right nonlinear observable functions to form an invariant subspace where it is possible to obtain linear reduced-order models, especially those that are useful for control, is an open challenge. Here, we investigate the choice of observable functions for Koopman analysis that enable the use of optimal linear control techniques on nonlinear problems. First, to include a cost on the state of the system, as in linear quadratic regulator (LQR) control, it is helpful to include these states in the observable subspace, as in DMD. However, we find that this is only possible when there is a single isolated fixed point, as systems with multiple fixed points or more complicated attractors are not globally topologically conjugate to a finite-dimensional linear system, and cannot be represented by a finite-dimensional linear Koopman subspace that includes the state. We then present a data-driven strategy to identify relevant observable functions for Koopman analysis by leveraging a new algorithm to determine relevant terms in a dynamical system by ℓ1-regularized regression of the data in a nonlinear function space; we also show how this algorithm is related to DMD. Finally, we demonstrate the usefulness of nonlinear observable subspaces in the design of Koopman operator optimal control laws for fully nonlinear systems using techniques from linear optimal control.
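
As a concrete illustration of a finite-dimensional Koopman-invariant subspace, the sketch below uses the often-cited example of a system with a single quadratic nonlinearity: augmenting the state (x1, x2) with the observable x1^2 yields an exactly linear three-dimensional model. The parameter values, initial condition, and integrator settings here are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative sketch (not the paper's code): a nonlinear system with a
# known finite-dimensional Koopman-invariant subspace.
#   x1' = mu * x1
#   x2' = lam * (x2 - x1**2)
# Lifting to y = [x1, x2, x1**2] gives exactly linear dynamics y' = K y.
mu, lam = -0.05, -1.0

def nonlinear_rhs(t, x):
    return [mu * x[0], lam * (x[1] - x[0] ** 2)]

# Koopman generator restricted to the subspace spanned by (x1, x2, x1^2)
K = np.array([[mu, 0.0, 0.0],
              [0.0, lam, -lam],
              [0.0, 0.0, 2 * mu]])

x0 = np.array([1.0, 2.0])
y0 = np.array([x0[0], x0[1], x0[0] ** 2])
t_eval = np.linspace(0, 50, 500)

sol_x = solve_ivp(nonlinear_rhs, (0, 50), x0, t_eval=t_eval)
sol_y = solve_ivp(lambda t, y: K @ y, (0, 50), y0, t_eval=t_eval)

# The lifted linear model reproduces the nonlinear state trajectory
print(np.max(np.abs(sol_x.y[0] - sol_y.y[0])))  # ~0 up to integrator tolerance
print(np.max(np.abs(sol_x.y[1] - sol_y.y[1])))  # ~0 up to integrator tolerance
```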


Nature Communications | 2017

Chaos as an intermittently forced linear system

Steven L. Brunton; Bingni W. Brunton; Joshua L. Proctor; Eurika Kaiser; J. Nathan Kutz

Understanding the interplay of order and disorder in chaos is a central challenge in modern quantitative science. Approximate linear representations of nonlinear dynamics have long been sought, driving considerable interest in Koopman theory. We present a universal, data-driven decomposition of chaos as an intermittently forced linear system. This work combines delay embedding and Koopman theory to decompose chaotic dynamics into a linear model in the leading delay coordinates with forcing by low-energy delay coordinates; this is called the Hankel alternative view of Koopman (HAVOK) analysis. This analysis is applied to the Lorenz system and real-world examples including Earth’s magnetic field reversal and measles outbreaks. In each case, forcing statistics are non-Gaussian, with long tails corresponding to rare intermittent forcing that precedes switching and bursting phenomena. The forcing activity demarcates coherent phase space regions where the dynamics are approximately linear from those that are strongly nonlinear.

The huge amount of data generated in fields like neuroscience or finance calls for effective strategies that mine data to reveal underlying dynamics. Here Brunton et al. develop a data-driven technique to analyze chaotic systems and predict their dynamics in terms of a forced linear model.
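
As a rough illustration of the HAVOK recipe, the sketch below builds a Hankel matrix of time-delayed measurements of the Lorenz x-coordinate, extracts eigen-time-delay coordinates with an SVD, and regresses a linear model for the leading coordinates with the last retained coordinate treated as forcing. The stack depth, rank, and regression method are illustrative assumptions rather than the paper's exact choices.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.stats import kurtosis

# A minimal HAVOK-style sketch; parameters are illustrative assumptions.
def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z), x * y - beta * z]

dt = 0.001
t = np.arange(0, 50, dt)
sol = solve_ivp(lorenz, (t[0], t[-1]), [-8.0, 8.0, 27.0], t_eval=t)
x = sol.y[0]                      # single measured coordinate

# Hankel matrix of time-delayed measurements
q = 100                           # number of stacked delays
H = np.array([x[i:i + len(x) - q] for i in range(q)])

# Eigen-time-delay coordinates via SVD
U, S, Vt = np.linalg.svd(H, full_matrices=False)
r = 15
V = Vt[:r].T                      # columns v_1..v_r over time

# Regress d/dt of v_1..v_{r-1} onto v_1..v_r; the last coordinate acts as forcing
dV = np.gradient(V, dt, axis=0)
A_aug = np.linalg.lstsq(V, dV[:, :r - 1], rcond=None)[0].T
A, B = A_aug[:, :r - 1], A_aug[:, r - 1:]   # linear dynamics + forcing term

# v_r(t) is the intermittent forcing signal; heavy tails show up as excess kurtosis
forcing = V[:, r - 1]
print(A.shape, B.shape, kurtosis(forcing))
```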


Journal of Computational Physics | 2018

Sparsity enabled cluster reduced-order models for control

Eurika Kaiser; Marek Morzyński; Guillaume Daviller; J. Nathan Kutz; Bingni W. Brunton; Steven L. Brunton

Characterizing and controlling nonlinear, multi-scale phenomena are central goals in science and engineering. Cluster-based reduced-order modeling (CROM) was introduced to exploit the underlying low-dimensional dynamics of complex systems. CROM builds a data-driven discretization of the Perron–Frobenius operator, resulting in a probabilistic model for ensembles of trajectories. A key advantage of CROM is that it embeds nonlinear dynamics in a linear framework, which enables the application of standard linear techniques to the nonlinear system. CROM is typically computed on high-dimensional data; however, access to and computations on this full-state data limit the online implementation of CROM for prediction and control. Here, we address this key challenge by identifying a small subset of critical measurements to learn an efficient CROM, referred to as sparsity-enabled CROM. In particular, we leverage compressive measurements to faithfully embed the cluster geometry and preserve the probabilistic dynamics. Further, we show how to identify fewer optimized sensor locations tailored to a specific problem that outperform random measurements. Both of these sparsity-enabled sensing strategies significantly reduce the burden of data acquisition and processing for low-latency in-time estimation and control. We illustrate this unsupervised learning approach on three different high-dimensional nonlinear dynamical systems from fluids with increasing complexity, with one application in flow control. Sparsity-enabled CROM is a critical facilitator for real-time implementation on high-dimensional systems where full-state information may be inaccessible.
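
The sketch below illustrates the basic CROM construction on synthetic snapshot data: k-means clustering discretizes the state space, transition counts between consecutive snapshots yield a row-stochastic approximation of the Perron–Frobenius operator, and a random projection stands in for the compressive measurements of the sparsity-enabled variant. Sizes, cluster count, and the use of a random (rather than optimized) measurement matrix are assumptions for illustration only.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

# Minimal sketch of cluster-based reduced-order modeling (CROM) on synthetic
# snapshot data; all sizes and cluster counts are illustrative assumptions.
rng = np.random.default_rng(0)
n_state, n_snapshots, n_clusters = 500, 2000, 10

# Synthetic trajectory of high-dimensional snapshots (stand-in for flow data)
X = np.cumsum(rng.standard_normal((n_snapshots, n_state)), axis=0)

# 1. Discretize state space into clusters (centroids = representative states)
labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(X)

# 2. Data-driven Perron-Frobenius discretization: cluster transition matrix
P = np.zeros((n_clusters, n_clusters))
for a, b in zip(labels[:-1], labels[1:]):
    P[a, b] += 1
P /= np.maximum(P.sum(axis=1, keepdims=True), 1)   # row-stochastic transition probabilities

# 3. Sparsity-enabled variant: cluster a few compressive measurements instead
n_meas = 20
C = rng.standard_normal((n_meas, n_state))          # random measurement matrix
labels_sparse = KMeans(n_clusters=n_clusters, n_init=10,
                       random_state=0).fit_predict(X @ C.T)

# High agreement indicates the compressive measurements preserve cluster geometry
print("cluster agreement (ARI):", adjusted_rand_score(labels, labels_sparse))
```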


Proceedings of the National Academy of Sciences of the United States of America | 2018

Neural-inspired sensors enable sparse, efficient classification of spatiotemporal data

Thomas Mohren; Thomas L. Daniel; Steven L. Brunton; Bingni W. Brunton

Significance: Winged insects perform remarkable aerial feats in uncertain, complex fluid environments. This ability is enabled by sensation of mechanical forces to inform rapid corrections in body orientation. Curiously, mechanoreceptor neurons do not faithfully report forces; instead, they are activated by specific time histories of forcing. We find that, far from being a bug, neural encoding by biological sensors is a feature that acts as built-in temporal filtering superbly matched to detect body rotation. Indeed, this encoding further enables surprisingly efficient detection using only a small handful of neurons at key locations. Nature suggests smart data as an alternative strategy to big data, and neural-inspired sensors establish a paradigm in hyperefficient sensing of complex systems.

Sparse sensor placement is a central challenge in the efficient characterization of complex systems when the cost of acquiring and processing data is high. Leading sparse sensing methods typically exploit either spatial or temporal correlations, but rarely both. This work introduces a sparse sensor optimization that is designed to leverage the rich spatiotemporal coherence exhibited by many systems. Our approach is inspired by the remarkable performance of flying insects, which use a few embedded strain-sensitive neurons to achieve rapid and robust flight control despite large gust disturbances. Specifically, we identify neural-inspired sensors at a few key locations on a flapping wing that are able to detect body rotation. This task is particularly challenging as the rotational twisting mode is three orders of magnitude smaller than the flapping modes. We show that nonlinear filtering in time, built to mimic strain-sensitive neurons, is essential to detect rotation, whereas instantaneous measurements fail. Optimized sparse sensor placement results in efficient classification with approximately 10 sensors, achieving the same accuracy and noise robustness as full measurements consisting of hundreds of sensors. Sparse sensing with neural-inspired encoding establishes an alternative paradigm in hyperefficient, embodied sensing of spatiotemporal data and sheds light on principles of biological sensing for agile flight control.
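
A hedged sketch of the encoding-then-sparse-classification idea is given below: each synthetic channel is passed through a temporal filter and a saturating nonlinearity (a crude linear-nonlinear stand-in for a strain-sensitive neuron), and an ℓ1-penalized classifier keeps only a few informative "sensors". The signal model, the hand-tuned filter, and the (much larger) rotation amplitude are illustrative assumptions, not the paper's wing-strain model or sensor optimization.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Hedged sketch: temporal filter + saturating nonlinearity per channel, then
# L1-penalized classification that selects a sparse set of sensors. All
# quantities are synthetic and far easier than the paper's setting.
rng = np.random.default_rng(1)
n_sensors, n_time, n_trials = 100, 400, 300
t = np.linspace(0, 4 * np.pi, n_time)
template = np.sin(3.0 * t)                    # hand-tuned filter near the "rotation" frequency

def make_trial(rotating):
    trial = np.sin(t)[None, :] * (1.0 + 0.1 * rng.standard_normal((n_sensors, 1)))
    if rotating:
        trial[:5] += 0.3 * np.sin(3.0 * t + 0.5)[None, :]   # only 5 "key locations" see it
    return trial + 0.05 * rng.standard_normal((n_sensors, n_time))

def encode(trial):
    filtered = (trial * template).mean(axis=1)          # temporal filtering per sensor
    return 1.0 / (1.0 + np.exp(-10.0 * filtered))       # saturating "firing rate"

y = rng.integers(0, 2, n_trials)
X = np.array([encode(make_trial(label)) for label in y])

# The L1 penalty zeroes most sensor weights, i.e. performs sparse sensor selection
clf = LogisticRegression(penalty="l1", solver="liblinear", C=1.0)
print("cv accuracy:", cross_val_score(clf, X, y, cv=5).mean())
print("selected sensors:", np.flatnonzero(clf.fit(X, y).coef_[0]))
```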


Archive | 2017

Data-Driven Methods in Fluid Dynamics: Sparse Classification from Experimental Data

Zhe Bai; Steven L. Brunton; Bingni W. Brunton; J. Nathan Kutz; Eurika Kaiser; Andreas Spohn; Bernd R. Noack

This work explores the use of data-driven methods, including machine learning and sparse sampling, for systems in fluid dynamics. In particular, camera images of a transitional separation bubble are used with dimensionality reduction and supervised classification techniques to discriminate between an actuated and an unactuated flow. After classification is demonstrated on full-resolution image data, similar classification performance is obtained using heavily subsampled pixels from the images. Finally, a sparse sensor optimization based on compressed sensing is used to determine optimal pixel locations for accurate classification. With 5–10 specially selected sensors, the median cross-validated classification accuracy is ≥ 97 %, as opposed to a random set of 5–10 pixels, which results in classification accuracy of 70–80 %. The methods developed here apply broadly to high-dimensional data from fluid dynamics experiments. Relevant connections between sparse sampling and the representation of high-dimensional data in a low-rank feature space are discussed.
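
The pipeline can be mimicked on synthetic data as in the sketch below: PCA-reduced full "images" are classified with a linear discriminant, and the same classifier is then run on a random handful of pixels. The synthetic data, dimensions, and the use of random (rather than compressed-sensing-optimized) pixels are assumptions for illustration; the numbers printed are not the chapter's results.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Illustrative sketch: actuated vs. unactuated flow is mimicked by a small
# mean shift in synthetic "images"; all sizes and shifts are assumptions.
rng = np.random.default_rng(2)
n_pixels, n_images = 64 * 64, 200
y = rng.integers(0, 2, n_images)                       # unactuated (0) vs. actuated (1)
X = rng.standard_normal((n_images, n_pixels)) + 0.2 * y[:, None]

# Full-resolution images: PCA feature space + supervised linear discriminant
full_model = make_pipeline(PCA(n_components=20), LinearDiscriminantAnalysis())
full_acc = cross_val_score(full_model, X, y, cv=5).mean()

# Heavily subsampled data: keep only a random handful of pixel "sensors"
pixels = rng.choice(n_pixels, size=10, replace=False)
sub_acc = cross_val_score(LinearDiscriminantAnalysis(), X[:, pixels], y, cv=5).mean()

print(f"full-image accuracy: {full_acc:.2f}, 10-random-pixel accuracy: {sub_acc:.2f}")
```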


bioRxiv | 2018

Extracting Reproducible Time-Resolved Resting State Networks using Dynamic Mode Decomposition

James M. Kunert-Graf; Kristian M. Eschenburg; David J. Galas; J. Nathan Kutz; Swati Rane; Bingni W. Brunton

Resting state networks (RSNs) extracted from functional magnetic resonance imaging (fMRI) scans are believed to reflect the intrinsic organization and network structure of brain regions. Most traditional methods for computing RSNs typically assume these functional networks are static throughout the duration of a scan lasting 5–15 minutes. However, they are known to vary on timescales ranging from seconds to years; in addition, the dynamic properties of RSNs are affected in a wide variety of neurological disorders. Recently, there has been a proliferation of methods for characterizing RSN dynamics, yet it remains a challenge to extract reproducible time-resolved networks. In this paper, we develop a novel method based on dynamic mode decomposition (DMD) to extract networks from short windows of noisy, high-dimensional fMRI data, allowing RSNs from single scans to be resolved robustly at a temporal resolution of seconds. We demonstrate this method on data from 120 individuals from the Human Connectome Project and show that unsupervised clustering of DMD modes discovers RSNs at both the group (gDMD) and the single subject (sDMD) levels. The gDMD modes closely resemble canonical RSNs. Compared to established methods, sDMD modes capture individualized RSN structure that both better resembles the population RSN and better captures subject-level variation. We further leverage this time-resolved sDMD analysis to infer occupancy and transitions among RSNs with high reproducibility. This automated DMD-based method is a powerful tool to characterize spatial and temporal structures of RSNs in individual subjects.
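
A minimal sketch of the windowed-DMD-plus-clustering idea is shown below on synthetic multichannel data: exact DMD is computed on each short window, and the resulting spatial modes are pooled and clustered without supervision. The window length, rank, and cluster count are illustrative assumptions, not the settings used on the Human Connectome Project data.

```python
import numpy as np
from sklearn.cluster import KMeans

# Windowed exact DMD followed by clustering of mode magnitudes; data are
# synthetic stand-ins for region-by-time fMRI signals.
rng = np.random.default_rng(3)
n_regions, n_timepoints, window, r = 90, 1200, 60, 5

X_full = np.cumsum(rng.standard_normal((n_regions, n_timepoints)), axis=1)

def dmd_modes(X, r):
    """Exact DMD of one window: returns r spatial modes (columns)."""
    X1, X2 = X[:, :-1], X[:, 1:]
    U, S, Vt = np.linalg.svd(X1, full_matrices=False)
    U, S, Vt = U[:, :r], S[:r], Vt[:r]
    A_tilde = U.T @ X2 @ Vt.T / S             # reduced linear operator
    _, W = np.linalg.eig(A_tilde)
    return X2 @ Vt.T @ np.diag(1.0 / S) @ W   # exact DMD modes

# Collect mode magnitudes from every sliding window
modes = []
for start in range(0, n_timepoints - window, window):
    Phi = dmd_modes(X_full[:, start:start + window], r)
    modes.append(np.abs(Phi).T)               # one row per mode
modes = np.vstack(modes)

# Unsupervised clustering of modes -> candidate time-resolved networks
labels = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(modes)
print(np.bincount(labels))
```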


International Conference of the IEEE Engineering in Medicine and Biology Society | 2016

Multistep model for predicting upper-limb 3D isometric force application from pre-movement electrocorticographic features

Jing Wu; Benjamin R. Shuman; Bingni W. Brunton; Katherine M. Steele; Jared D. Olson; Rajesh P. N. Rao; Jeffrey G. Ojemann

Neural correlates of movement planning onset and direction may be present in human electrocorticography in the signal dynamics of both motor and non-motor cortical regions. We use a three-stage model of jPCA reduced-rank hidden Markov model (jPCA-RR-HMM), regularized shrunken-centroid discriminant analysis (RDA), and LASSO regression to extract direction-sensitive planning information and movement onset in an upper-limb 3D isometric force task in a human subject. This model achieves a relatively high true-positive force-onset prediction rate of 60% within 250 ms, and an above-chance 36% accuracy (17% chance) in predicting one of six planned 3D directions of isometric force using pre-movement signals. We also find direction-distinguishing information up to 400 ms before force onset in the pre-movement signals, captured by electrodes placed over the limb-ipsilateral dorsal premotor regions. This approach can contribute to more accurate decoding of higher-level movement goals, at earlier timescales, and inform sensor placement. Our results also contribute to further understanding of the spatiotemporal features of human motor planning.
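
Two of the three stages can be sketched with off-the-shelf tools, as below: a nearest shrunken-centroid classifier stands in for the regularized shrunken-centroid discriminant, and LASSO regression is applied to a synthetic onset-related variable. The jPCA reduced-rank HMM stage is omitted, and all features, dimensions, and signals are hypothetical rather than ECoG data.

```python
import numpy as np
from sklearn.neighbors import NearestCentroid
from sklearn.linear_model import Lasso

# Hedged sketch of two decoding stages on synthetic pre-movement features.
rng = np.random.default_rng(4)
n_trials, n_features, n_directions = 180, 50, 6

directions = rng.integers(0, n_directions, n_trials)
features = rng.standard_normal((n_trials, n_features))
features[:, :5] += directions[:, None] * 0.8     # a few direction-tuned features

# Shrunken-centroid discriminant for one of six planned force directions
direction_clf = NearestCentroid(shrink_threshold=0.5).fit(features, directions)
print("(training) direction accuracy:",
      (direction_clf.predict(features) == directions).mean())

# LASSO regression of a synthetic onset-related variable from the same features
onset_times = 0.3 * features[:, 0] + 0.05 * rng.standard_normal(n_trials)
onset_model = Lasso(alpha=0.05).fit(features, onset_times)
print("nonzero LASSO coefficients:", np.flatnonzero(onset_model.coef_))
```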


arXiv: Computer Vision and Pattern Recognition | 2013

Optimal Sensor Placement and Enhanced Sparsity for Classification.

Bingni W. Brunton; Steven L. Brunton; Joshua L. Proctor; J. Nathan Kutz


Archive | 2017

Data-Driven Sparse Sensor Placement for Reconstruction

Krithika Manohar; Bingni W. Brunton; J. Nathan Kutz; Steven L. Brunton


arXiv: Optimization and Control | 2017

Data-Driven Sparse Sensor Placement.

Krithika Manohar; Bingni W. Brunton; J. Nathan Kutz; Steven L. Brunton

Collaboration


Dive into Bingni W. Brunton's collaborations.

Top Co-Authors

J. Nathan Kutz
University of Washington

Bernd R. Noack
Centre national de la recherche scientifique

Zhe Bai
University of Washington