
Publication


Featured research published by Maziar Raissi.


Journal of Computational Physics | 2018

Hidden physics models: Machine learning of nonlinear partial differential equations

Maziar Raissi; George Em Karniadakis

While there is currently a lot of enthusiasm about “big data”, useful data is usually “small” and expensive to acquire. In this paper, we present a new paradigm of learning partial differential equations from small data. In particular, we introduce hidden physics models, which are essentially data-efficient learning machines capable of leveraging the underlying laws of physics, expressed by time dependent and nonlinear partial differential equations, to extract patterns from high-dimensional data generated from experiments. The proposed methodology may be applied to the problem of learning, system identification, or data-driven discovery of partial differential equations. Our framework relies on Gaussian processes, a powerful tool for probabilistic inference over functions, that enables us to strike a balance between model complexity and data fitting. The effectiveness of the proposed approach is demonstrated through a variety of canonical problems, spanning a number of scientific domains, including the Navier–Stokes, Schrödinger, Kuramoto–Sivashinsky, and time dependent linear fractional equations. The methodology provides a promising new direction for harnessing the long-standing developments of classical methods in applied mathematics and mathematical physics to design learning machines with the ability to operate in complex domains without requiring large quantities of data.
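The Gaussian-process machinery the abstract refers to can be conveyed in a few lines. The following is a minimal, illustrative sketch of GP regression on "small data", not the paper's implementation; the RBF kernel, the hyperparameters, and the toy sin(x) target are assumptions made purely for illustration.

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel k(x, x') = s^2 exp(-(x - x')^2 / (2 l^2))."""
    d2 = (x1[:, None] - x2[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior(x_train, y_train, x_test, noise=1e-4):
    """Posterior mean and pointwise variance of a zero-mean GP at x_test."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_test, x_train)
    Kss = rbf_kernel(x_test, x_test)
    mean = Ks @ np.linalg.solve(K, y_train)
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.diag(cov)

# "Small data": five noisy-free samples of sin(x) on [0, pi].
x = np.linspace(0.0, np.pi, 5)
y = np.sin(x)
mu, var = gp_posterior(x, y, np.array([np.pi / 2]))
```

The balance between model complexity and data fitting mentioned in the abstract enters through the kernel hyperparameters and the noise term, which in practice are learned rather than fixed as here.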


Journal of Computational Physics | 2017

Inferring solutions of differential equations using noisy multi-fidelity data

Maziar Raissi; Paris Perdikaris; George Em Karniadakis

For more than two centuries, solutions of differential equations have been obtained either analytically or numerically based on typically well-behaved forcing and boundary conditions for well-posed problems. We are changing this paradigm in a fundamental way by establishing an interface between probabilistic machine learning and differential equations. We develop data-driven algorithms for general linear equations using Gaussian process priors tailored to the corresponding integro-differential operators. The only observables are scarce noisy multi-fidelity data for the forcing and solution that are not required to reside on the domain boundary. The resulting predictive posterior distributions quantify uncertainty and naturally lead to adaptive solution refinement via active learning. This general framework circumvents the tyranny of numerical discretization as well as the consistency and stability issues of time-integration, and is scalable to high-dimensions.
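The "adaptive solution refinement via active learning" mentioned above amounts to querying new data where the posterior is least certain. A minimal sketch, assuming an RBF kernel and a toy 1-D domain (neither of which comes from the paper): propose the next sample at the candidate location with the largest GP predictive variance.

```python
import numpy as np

def rbf(x1, x2, lengthscale=0.5):
    return np.exp(-0.5 * (x1[:, None] - x2[None, :]) ** 2 / lengthscale**2)

def posterior_variance(x_train, x_cand, noise=1e-6):
    """Pointwise GP predictive variance; independent of the observed values."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_cand, x_train)
    return 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)

# Data observed only at the domain endpoints; the active-learning rule
# proposes the most uncertain candidate as the next query point.
x_train = np.array([0.0, 1.0])
x_cand = np.linspace(0.0, 1.0, 101)
x_next = x_cand[np.argmax(posterior_variance(x_train, x_cand))]
```

With observations at the two endpoints, the variance peaks at the midpoint, so the next query lands where the data are most informative.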


SIAM Journal on Scientific Computing | 2018

Numerical Gaussian Processes for Time-Dependent and Nonlinear Partial Differential Equations

Maziar Raissi; Paris Perdikaris; George Em Karniadakis

We introduce the concept of numerical Gaussian processes, which we define as Gaussian processes with covariance functions resulting from temporal discretization of time-dependent partial differential equations. Numerical Gaussian processes, by construction, are designed to deal with cases where (a) all we observe are noisy data on black-box initial conditions, and (b) we are interested in quantifying the uncertainty associated with such noisy data in our solutions to time-dependent partial differential equations. Our method circumvents the need for spatial discretization of the differential operators by proper placement of Gaussian process priors. This is an attempt to construct structured and data-efficient learning machines, which are explicitly informed by the underlying physics that possibly generated the observed data. The effectiveness of the proposed approach is demonstrated through several benchmark problems involving linear and nonlinear time-dependent operators. In all examples, we are able to r...


Journal of Computational Physics | 2017

Machine learning of linear differential equations using Gaussian processes

Maziar Raissi; Paris Perdikaris; George Em Karniadakis

This work leverages recent advances in probabilistic machine learning to discover governing equations expressed by parametric linear operators. Such equations involve, but are not limited to, ordinary and partial differential, integro-differential, and fractional order operators. Here, Gaussian process priors are modified according to the particular form of such operators and are employed to infer parameters of the linear equations from scarce and possibly noisy observations. Such observations may come from experiments or black-box computer simulations, as demonstrated in several synthetic examples and a realistic application in functional genomics.

Highlights:
- Employ probabilistic machine learning to discover governing equations expressed by parametric linear operators.
- Proper placement of Gaussian process priors allows one to efficiently infer model parameters via maximum likelihood estimation.
- A general treatment of inverse problems governed by linear operators, leading to model discovery from just a handful of noisy measurements.


Proceedings of the Royal Society A: Mathematical, Physical and Engineering Science | 2017

Nonlinear information fusion algorithms for data-efficient multi-fidelity modelling

Paris Perdikaris; Maziar Raissi; Andreas C. Damianou; Neil D. Lawrence; George Em Karniadakis

Multi-fidelity modelling enables accurate inference of quantities of interest by synergistically combining realizations of low-cost/low-fidelity models with a small set of high-fidelity observations. This is particularly effective when the low- and high-fidelity models exhibit strong correlations, and can lead to significant computational gains over approaches that solely rely on high-fidelity models. However, in many cases of practical interest, low-fidelity models can only be well correlated to their high-fidelity counterparts for a specific range of input parameters, and potentially return wrong trends and erroneous predictions if probed outside of their validity regime. Here we put forth a probabilistic framework based on Gaussian process regression and nonlinear autoregressive schemes that is capable of learning complex nonlinear and space-dependent cross-correlations between models of variable fidelity, and can effectively safeguard against low-fidelity models that provide wrong trends. This introduces a new class of multi-fidelity information fusion algorithms that provide a fundamental extension to the existing linear autoregressive methodologies, while still maintaining the same algorithmic complexity and overall computational cost. The performance of the proposed methods is tested in several benchmark problems involving both synthetic and real multi-fidelity datasets from computational fluid dynamics simulations.
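A linear information-fusion sketch conveys the structure the abstract describes: model the high-fidelity response as a scaling of a low-fidelity surrogate plus a learned discrepancy. Everything below is illustrative, not the paper's method: the Forrester-style toy functions, the fixed RBF lengthscale, and the linear correction with a trend term are assumptions, and the paper's nonlinear autoregressive scheme replaces the linear step with a GP over (x, f_lo(x)).

```python
import numpy as np

def f_hi(x):  # expensive "high-fidelity" model (toy)
    return (6 * x - 2) ** 2 * np.sin(12 * x - 4)

def f_lo(x):  # cheap, biased "low-fidelity" model (toy)
    return 0.5 * f_hi(x) + 10 * (x - 0.5) - 5

def gp_fit_predict(x_train, y_train, x_test, lengthscale=0.15, noise=1e-8):
    """Posterior mean of a zero-mean RBF GP fit to (x_train, y_train)."""
    def k(a, b):
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / lengthscale**2)
    K = k(x_train, x_train) + noise * np.eye(len(x_train))
    return k(x_test, x_train) @ np.linalg.solve(K, y_train)

x_lo = np.linspace(0.0, 1.0, 11)           # many cheap samples
x_hi = np.array([0.0, 0.4, 0.6, 1.0])      # few expensive samples

# Step 1: surrogate for the low-fidelity model, evaluated at the hi-fi inputs.
lo_at_hi = gp_fit_predict(x_lo, f_lo(x_lo), x_hi)

# Step 2: learn f_hi ~ rho * f_lo + linear trend from the few hi-fi points.
A = np.column_stack([lo_at_hi, x_hi, np.ones_like(x_hi)])
coef, *_ = np.linalg.lstsq(A, f_hi(x_hi), rcond=None)

# Step 3: GP on the remaining discrepancy.
resid = f_hi(x_hi) - A @ coef

def predict_hi(x):
    lo = gp_fit_predict(x_lo, f_lo(x_lo), x)
    trend = np.column_stack([lo, x, np.ones_like(x)]) @ coef
    return trend + gp_fit_predict(x_hi, resid, x)

pred = predict_hi(np.array([0.8]))
```

Because the toy low- and high-fidelity models are exactly linearly related here, four expensive samples suffice to recover the high-fidelity function; the paper's contribution is precisely to handle the harder case where this relationship is nonlinear and only valid locally.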


International Journal of Computer Mathematics | 2018

Application of local improvements to reduced-order models to sampling methods for nonlinear PDEs with noise

Maziar Raissi; Padmanabhan Seshaiyer

In this work, we extend the results of Raissi and Seshaiyer [A multi-fidelity stochastic collocation method for parabolic partial differential equations with random input data, Int. J. Uncertain. Quantif. 4(3) (2014), pp. 225–242], where the authors propose to use deterministic model-reduction techniques to enhance the performance of sampling methods such as Monte Carlo and stochastic collocation. However, applying the method of Raissi and Seshaiyer (2014) to non-linear problems requires a crucial additional step: local improvements to reduced-order models. This paper illustrates the importance of this step. Local improvements to reduced-order models are achieved using sensitivity analysis of the proper orthogonal decomposition.
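Proper orthogonal decomposition, the reduced-order-modelling tool the paper builds on, is in practice an SVD of a snapshot matrix: the leading left singular vectors are the POD modes. A minimal sketch on a synthetic two-mode field (the field, grid, and snapshot count are illustrative assumptions, not taken from the paper):

```python
import numpy as np

# Snapshots of a field that is, by construction, a sum of two spatial modes
# with time-dependent amplitudes -- so POD should find exactly rank 2.
x = np.linspace(0.0, 1.0, 100)
t = np.linspace(0.0, 1.0, 50)
snapshots = (np.sin(np.pi * x)[:, None] * np.exp(-t)[None, :]
             + 0.5 * np.sin(2 * np.pi * x)[:, None] * np.exp(-2 * t)[None, :])

# POD = SVD of the snapshot matrix; columns of U are the spatial modes,
# and the singular values rank the "energy" each mode carries.
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)

r = 2  # retain the two energetic modes
reduced = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]
error = np.linalg.norm(snapshots - reduced) / np.linalg.norm(snapshots)
```

The local improvements the paper discusses address the situation where a basis built from one set of snapshots must remain accurate as parameters change, which is where plain truncation like the above breaks down.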


Energy Economics | 2012

The Differential Effects of Oil Demand and Supply Shocks on the Global Economy

Paul Cashin; Kamiar Mohaddes; Maziar Raissi; Mehdi Raissi


arXiv: Artificial Intelligence | 2017

Physics Informed Deep Learning (Part II): Data-driven Discovery of Nonlinear Partial Differential Equations.

Maziar Raissi; Paris Perdikaris; George Em Karniadakis


arXiv: Dynamical Systems | 2018

Multistep Neural Networks for Data-driven Discovery of Nonlinear Dynamical Systems

Maziar Raissi; Paris Perdikaris; George Em Karniadakis


arXiv: Machine Learning | 2017

Parametric Gaussian Process Regression for Big Data.

Maziar Raissi

Collaboration

Maziar Raissi's top co-authors:

Paris Perdikaris (Massachusetts Institute of Technology)
Mehdi Raissi (International Monetary Fund)
Michael S. Triantafyllou (Massachusetts Institute of Technology)
Paul Cashin (International Monetary Fund)
Zhicheng Wang (Massachusetts Institute of Technology)