Partial Differential Equations

The Experts below are selected from a list of 360 Experts worldwide, ranked by the ideXlab platform.

George Em Karniadakis - One of the best experts on this subject based on the ideXlab platform.

  • Hidden physics models: Machine learning of nonlinear Partial Differential Equations
    Journal of Computational Physics, 2018
    Co-Authors: Maziar Raissi, George Em Karniadakis
    Abstract:

    While there is currently a lot of enthusiasm about “big data”, useful data is usually “small” and expensive to acquire. In this paper, we present a new paradigm of learning Partial Differential Equations from small data. In particular, we introduce hidden physics models, which are essentially data-efficient learning machines capable of leveraging the underlying laws of physics, expressed by time-dependent and nonlinear Partial Differential Equations, to extract patterns from high-dimensional data generated from experiments. The proposed methodology may be applied to the problem of learning, system identification, or data-driven discovery of Partial Differential Equations. Our framework relies on Gaussian processes, a powerful tool for probabilistic inference over functions, which enables us to strike a balance between model complexity and data fitting. The effectiveness of the proposed approach is demonstrated through a variety of canonical problems, spanning a number of scientific domains, including the Navier–Stokes, Schrödinger, Kuramoto–Sivashinsky, and time-dependent linear fractional Equations. The methodology provides a promising new direction for harnessing the long-standing developments of classical methods in applied mathematics and mathematical physics to design learning machines with the ability to operate in complex domains without requiring large quantities of data.
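
    To make the construction above concrete, the following is a minimal, hypothetical sketch of the idea rather than the authors' implementation: two noisy snapshots of the advection equation u_t + c u_x = 0 are linked by a backward Euler step, a squared-exponential Gaussian process prior is placed on the later snapshot, and the unknown wave speed c is recovered by minimizing the negative log marginal likelihood. All names, grid sizes, and hyperparameters below are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.linalg import cho_factor, cho_solve

    # Toy data: two noisy snapshots of u_t + c*u_x = 0, whose exact solution is sin(x - c*t).
    np.random.seed(0)
    c_true, dt, noise = 1.0, 0.05, 1e-2
    x0 = np.linspace(0, 2 * np.pi, 25)              # locations of the earlier snapshot u^{n-1}
    x1 = np.linspace(0, 2 * np.pi, 25)              # locations of the later snapshot u^n
    y0 = np.sin(x0) + noise * np.random.randn(x0.size)
    y1 = np.sin(x1 - c_true * dt) + noise * np.random.randn(x1.size)
    y = np.concatenate([y0, y1])

    sig2, ell = 1.0, 1.0                            # RBF hyperparameters, fixed for simplicity

    def k(a, b):                                    # squared-exponential kernel
        r = a[:, None] - b[None, :]
        return sig2 * np.exp(-0.5 * r**2 / ell**2)

    def dk_dx(a, b):                                # d k / dx (derivative in the first argument)
        r = a[:, None] - b[None, :]
        return -(r / ell**2) * k(a, b)

    def dk_dxdxp(a, b):                             # d^2 k / dx dx'
        r = a[:, None] - b[None, :]
        return (1.0 / ell**2 - r**2 / ell**4) * k(a, b)

    def nlml(c):
        # Backward Euler links the snapshots: u^{n-1} = u^n + dt*c*d_x(u^n).
        # Under a GP prior on u^n this induces the block covariance assembled below.
        K00 = (k(x0, x0) + dt * c * (dk_dx(x0, x0) + dk_dx(x0, x0).T)
               + (dt * c) ** 2 * dk_dxdxp(x0, x0))
        K01 = k(x0, x1) + dt * c * dk_dx(x0, x1)
        K = np.block([[K00, K01], [K01.T, k(x1, x1)]]) + noise**2 * np.eye(y.size)
        chol = cho_factor(K, lower=True)
        return 0.5 * y @ cho_solve(chol, y) + np.sum(np.log(np.diag(chol[0])))

    # Crude grid search over the hidden physics parameter c; a gradient-based optimizer
    # over c and the kernel hyperparameters would be the more realistic choice.
    cs = np.linspace(0.0, 2.0, 201)
    c_hat = cs[np.argmin([nlml(c) for c in cs])]
    print(f"estimated wave speed: {c_hat:.3f} (true value {c_true})")
    ```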

  • Numerical Gaussian processes for time-dependent and nonlinear Partial Differential Equations
    SIAM Journal on Scientific Computing, 2018
    Co-Authors: Maziar Raissi, Paris Perdikaris, George Em Karniadakis
    Abstract:

    We introduce the concept of numerical Gaussian processes, which we define as Gaussian processes with covariance functions resulting from temporal discretization of time-dependent Partial Differential Equations. Numerical Gaussian processes, by construction, are designed to deal with cases where (a) all we observe are noisy data on black-box initial conditions, and (b) we are interested in quantifying the uncertainty associated with such noisy data in our solutions to time-dependent Partial Differential Equations. Our method circumvents the need for spatial discretization of the Differential operators by proper placement of Gaussian process priors. This is an attempt to construct structured and data-efficient learning machines, which are explicitly informed by the underlying physics that possibly generated the observed data. The effectiveness of the proposed approach is demonstrated through several benchmark problems involving linear and nonlinear time-dependent operators. In all examples, we are able to recover accurate approximations of the latent solutions, and consistently propagate uncertainty, even in cases involving very long time integration.
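
    A minimal sketch of one such construction, under the assumption that the governing equation is the heat equation u_t = alpha*u_xx discretized with backward Euler (all names, data, and hyperparameters below are illustrative, not the paper's implementation): noisy observations of the initial condition are conditioned on through the operator-transformed covariance, yielding a posterior mean and a predictive uncertainty for the solution at the next time step.

    ```python
    import numpy as np
    from scipy.linalg import cho_factor, cho_solve

    # One backward-Euler step of u_t = alpha*u_xx treated as a numerical Gaussian process.
    np.random.seed(1)
    alpha, dt, noise = 1.0, 0.01, 1e-2
    x_obs = np.linspace(0.0, 2.0 * np.pi, 30)       # noisy data on the initial condition u^{n-1}
    y_obs = np.sin(x_obs) + noise * np.random.randn(x_obs.size)
    x_new = np.linspace(0.0, 2.0 * np.pi, 200)      # where we want u^n with uncertainty

    sig2, ell = 1.0, 1.0                            # RBF hyperparameters, fixed for simplicity

    def rbf(a, b):
        r = a[:, None] - b[None, :]
        return sig2 * np.exp(-0.5 * r**2 / ell**2), r

    def k(a, b):                                    # Cov(u^n(a), u^n(b))
        return rbf(a, b)[0]

    def d2(a, b):                                   # d^2 k / dx'^2 (equals d^2 k / dx^2 for the RBF)
        K, r = rbf(a, b)
        return (r**2 / ell**4 - 1.0 / ell**2) * K

    def d2d2(a, b):                                 # d^4 k / dx^2 dx'^2
        K, r = rbf(a, b)
        return (r**4 / ell**8 - 6.0 * r**2 / ell**6 + 3.0 / ell**4) * K

    # Backward Euler: u^{n-1} = (1 - dt*alpha*d_xx) u^n, so a GP prior on u^n gives the
    # observed-snapshot covariance L_x L_x' k and the cross-covariance L_x' k used below.
    c = dt * alpha
    K_obs = k(x_obs, x_obs) - 2.0 * c * d2(x_obs, x_obs) + c**2 * d2d2(x_obs, x_obs)
    K_obs += noise**2 * np.eye(x_obs.size)
    K_cross = k(x_new, x_obs) - c * d2(x_new, x_obs)

    chol = cho_factor(K_obs, lower=True)
    mean = K_cross @ cho_solve(chol, y_obs)                        # posterior mean of u^n
    cov = k(x_new, x_new) - K_cross @ cho_solve(chol, K_cross.T)
    std = np.sqrt(np.clip(np.diag(cov), 0.0, None))                # predictive uncertainty

    print("max |posterior mean - exact u^n|:",
          np.max(np.abs(mean - np.exp(-alpha * dt) * np.sin(x_new))))
    ```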

  • Physics-informed deep learning (Part II): Data-driven discovery of nonlinear Partial Differential Equations
    arXiv: Artificial Intelligence, 2017
    Co-Authors: Maziar Raissi, Paris Perdikaris, George Em Karniadakis
    Abstract:

    We introduce physics-informed neural networks -- neural networks that are trained to solve supervised learning tasks while respecting any given law of physics described by general nonlinear Partial Differential Equations. In this second part of our two-part treatise, we focus on the problem of data-driven discovery of Partial Differential Equations. Depending on whether the available data is scattered in space-time or arranged in fixed temporal snapshots, we introduce two main classes of algorithms, namely continuous-time and discrete-time models. The effectiveness of our approach is demonstrated using a wide range of benchmark problems in mathematical physics, including conservation laws, incompressible fluid flow, and the propagation of nonlinear shallow-water waves.
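
    As a rough illustration of the continuous-time discovery setting (a simplified sketch under assumed data and architecture, not the authors' implementation), one can fit a network u(t, x) to scattered observations of the heat equation u_t = alpha*u_xx while treating alpha itself as a trainable parameter:

    ```python
    import torch

    # Synthetic data: u = exp(-pi^2 t) * sin(pi x) solves u_t = u_xx, i.e. alpha = 1.
    torch.manual_seed(0)
    t = 0.2 * torch.rand(2000, 1)
    x = torch.rand(2000, 1)
    u_data = torch.exp(-torch.pi**2 * t) * torch.sin(torch.pi * x)

    net = torch.nn.Sequential(
        torch.nn.Linear(2, 32), torch.nn.Tanh(),
        torch.nn.Linear(32, 32), torch.nn.Tanh(),
        torch.nn.Linear(32, 1),
    )
    log_alpha = torch.zeros(1, requires_grad=True)   # unknown coefficient, learned via its log

    opt = torch.optim.Adam(list(net.parameters()) + [log_alpha], lr=1e-3)
    for step in range(5000):
        opt.zero_grad()
        t_, x_ = t.clone().requires_grad_(True), x.clone().requires_grad_(True)
        u = net(torch.cat([t_, x_], dim=1))
        u_t = torch.autograd.grad(u, t_, torch.ones_like(u), create_graph=True)[0]
        u_x = torch.autograd.grad(u, x_, torch.ones_like(u), create_graph=True)[0]
        u_xx = torch.autograd.grad(u_x, x_, torch.ones_like(u_x), create_graph=True)[0]
        residual = u_t - torch.exp(log_alpha) * u_xx  # PDE residual with the unknown alpha
        loss = torch.mean((u - u_data) ** 2) + torch.mean(residual ** 2)
        loss.backward()
        opt.step()

    print("recovered alpha:", torch.exp(log_alpha).item())  # should end up close to 1.0
    ```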

  • Numerical Gaussian processes for time-dependent and non-linear Partial Differential Equations
    arXiv: Machine Learning, 2017
    Co-Authors: Maziar Raissi, Paris Perdikaris, George Em Karniadakis
    Abstract:

    We introduce the concept of numerical Gaussian processes, which we define as Gaussian processes with covariance functions resulting from temporal discretization of time-dependent Partial Differential Equations. Numerical Gaussian processes, by construction, are designed to deal with cases where: (1) all we observe are noisy data on black-box initial conditions, and (2) we are interested in quantifying the uncertainty associated with such noisy data in our solutions to time-dependent Partial Differential Equations. Our method circumvents the need for spatial discretization of the Differential operators by proper placement of Gaussian process priors. This is an attempt to construct structured and data-efficient learning machines, which are explicitly informed by the underlying physics that possibly generated the observed data. The effectiveness of the proposed approach is demonstrated through several benchmark problems involving linear and nonlinear time-dependent operators. In all examples, we are able to recover accurate approximations of the latent solutions, and consistently propagate uncertainty, even in cases involving very long time integration.

Maziar Raissi - One of the best experts on this subject based on the ideXlab platform.

  • Forward-backward stochastic neural networks: Deep learning of high-dimensional Partial Differential Equations
    arXiv: Machine Learning, 2018
    Co-Authors: Maziar Raissi
    Abstract:

    Classical numerical methods for solving Partial Differential Equations suffer from the curse of dimensionality, mainly due to their reliance on meticulously generated spatio-temporal grids. Inspired by modern deep-learning-based techniques for solving forward and inverse problems associated with Partial Differential Equations, we circumvent the tyranny of numerical discretization by devising an algorithm that is scalable to high dimensions. In particular, we approximate the unknown solution by a deep neural network, which essentially enables us to benefit from the merits of automatic differentiation. To train the aforementioned neural network we leverage the well-known connection between high-dimensional Partial Differential Equations and forward-backward stochastic Differential Equations. In fact, independent realizations of a standard Brownian motion will act as training data. We test the effectiveness of our approach for a couple of benchmark problems spanning a number of scientific domains, including the Black-Scholes-Barenblatt and Hamilton-Jacobi-Bellman Equations, both in 100 dimensions.
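
    A stripped-down sketch of the general recipe, on a hypothetical toy case rather than the paper's Black-Scholes-Barenblatt or Hamilton-Jacobi-Bellman setups: for the D-dimensional PDE u_t + 0.5*sigma^2*Laplacian(u) = 0 with terminal condition u(T, x) = |x|^2, the associated backward SDE says dY = sigma*grad(u) . dW along Brownian paths, and a network for u(t, x) is trained so that its values and gradients satisfy this relation step by step. The exact solution u(t, x) = |x|^2 + sigma^2*D*(T - t) provides a reference value; all sizes and names below are illustrative.

    ```python
    import torch

    torch.manual_seed(0)
    D, T, N, sigma = 10, 1.0, 20, 1.0               # dimensions, horizon, time steps, volatility
    batch, dt = 128, T / N
    x_init = torch.zeros(D)                         # evaluate u at this point and t = 0

    net = torch.nn.Sequential(                      # approximates u(t, x)
        torch.nn.Linear(D + 1, 64), torch.nn.Tanh(),
        torch.nn.Linear(64, 64), torch.nn.Tanh(),
        torch.nn.Linear(64, 1),
    )

    def u_and_grad(t, x):
        x = x.clone().requires_grad_(True)
        u = net(torch.cat([t, x], dim=1))
        u_x = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
        return u, u_x

    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    for it in range(2000):
        opt.zero_grad()
        X = x_init.expand(batch, D).clone()         # independent Brownian paths act as training data
        t = torch.zeros(batch, 1)
        Y, Z = u_and_grad(t, X)
        loss = 0.0
        for n in range(N):
            dW = dt**0.5 * torch.randn(batch, D)
            X_next, t_next = X + sigma * dW, t + dt     # Euler step of the forward SDE
            Y_next, Z_next = u_and_grad(t_next, X_next)
            # BSDE residual: the increment of Y should equal sigma * grad(u) . dW
            Y_pred = Y + torch.sum(sigma * Z * dW, dim=1, keepdim=True)
            loss = loss + torch.mean((Y_next - Y_pred) ** 2)
            X, t, Y, Z = X_next, t_next, Y_next, Z_next
        loss = loss + torch.mean((Y - X.pow(2).sum(dim=1, keepdim=True)) ** 2)  # terminal condition
        loss.backward()
        opt.step()

    u0 = net(torch.cat([torch.zeros(1, 1), x_init.unsqueeze(0)], dim=1)).item()
    print("u(0, 0) ~", u0, "  exact:", sigma**2 * D * T)
    ```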

  • Hidden physics models: Machine learning of nonlinear Partial Differential Equations
    Journal of Computational Physics, 2018
    Co-Authors: Maziar Raissi, George Em Karniadakis
    Abstract:

    While there is currently a lot of enthusiasm about “big data”, useful data is usually “small” and expensive to acquire. In this paper, we present a new paradigm of learning Partial Differential Equations from small data. In particular, we introduce hidden physics models, which are essentially data-efficient learning machines capable of leveraging the underlying laws of physics, expressed by time-dependent and nonlinear Partial Differential Equations, to extract patterns from high-dimensional data generated from experiments. The proposed methodology may be applied to the problem of learning, system identification, or data-driven discovery of Partial Differential Equations. Our framework relies on Gaussian processes, a powerful tool for probabilistic inference over functions, which enables us to strike a balance between model complexity and data fitting. The effectiveness of the proposed approach is demonstrated through a variety of canonical problems, spanning a number of scientific domains, including the Navier–Stokes, Schrödinger, Kuramoto–Sivashinsky, and time-dependent linear fractional Equations. The methodology provides a promising new direction for harnessing the long-standing developments of classical methods in applied mathematics and mathematical physics to design learning machines with the ability to operate in complex domains without requiring large quantities of data.

  • Numerical Gaussian processes for time-dependent and nonlinear Partial Differential Equations
    SIAM Journal on Scientific Computing, 2018
    Co-Authors: Maziar Raissi, Paris Perdikaris, George Em Karniadakis
    Abstract:

    We introduce the concept of numerical Gaussian processes, which we define as Gaussian processes with covariance functions resulting from temporal discretization of time-dependent Partial Differential Equations. Numerical Gaussian processes, by construction, are designed to deal with cases where (a) all we observe are noisy data on black-box initial conditions, and (b) we are interested in quantifying the uncertainty associated with such noisy data in our solutions to time-dependent Partial Differential Equations. Our method circumvents the need for spatial discretization of the Differential operators by proper placement of Gaussian process priors. This is an attempt to construct structured and data-efficient learning machines, which are explicitly informed by the underlying physics that possibly generated the observed data. The effectiveness of the proposed approach is demonstrated through several benchmark problems involving linear and nonlinear time-dependent operators. In all examples, we are able to recover accurate approximations of the latent solutions, and consistently propagate uncertainty, even in cases involving very long time integration.

  • Physics-informed deep learning (Part II): Data-driven discovery of nonlinear Partial Differential Equations
    arXiv: Artificial Intelligence, 2017
    Co-Authors: Maziar Raissi, Paris Perdikaris, George Em Karniadakis
    Abstract:

    We introduce physics-informed neural networks -- neural networks that are trained to solve supervised learning tasks while respecting any given law of physics described by general nonlinear Partial Differential Equations. In this second part of our two-part treatise, we focus on the problem of data-driven discovery of Partial Differential Equations. Depending on whether the available data is scattered in space-time or arranged in fixed temporal snapshots, we introduce two main classes of algorithms, namely continuous-time and discrete-time models. The effectiveness of our approach is demonstrated using a wide range of benchmark problems in mathematical physics, including conservation laws, incompressible fluid flow, and the propagation of nonlinear shallow-water waves.

  • Numerical Gaussian processes for time-dependent and non-linear Partial Differential Equations
    arXiv: Machine Learning, 2017
    Co-Authors: Maziar Raissi, Paris Perdikaris, George Em Karniadakis
    Abstract:

    We introduce the concept of numerical Gaussian processes, which we define as Gaussian processes with covariance functions resulting from temporal discretization of time-dependent Partial Differential Equations. Numerical Gaussian processes, by construction, are designed to deal with cases where: (1) all we observe are noisy data on black-box initial conditions, and (2) we are interested in quantifying the uncertainty associated with such noisy data in our solutions to time-dependent Partial Differential Equations. Our method circumvents the need for spatial discretization of the Differential operators by proper placement of Gaussian process priors. This is an attempt to construct structured and data-efficient learning machines, which are explicitly informed by the underlying physics that possibly generated the observed data. The effectiveness of the proposed approach is demonstrated through several benchmark problems involving linear and nonlinear time-dependent operators. In all examples, we are able to recover accurate approximations of the latent solutions, and consistently propagate uncertainty, even in cases involving very long time integration.

Paris Perdikaris - One of the best experts on this subject based on the ideXlab platform.

  • Numerical Gaussian processes for time-dependent and nonlinear Partial Differential Equations
    SIAM Journal on Scientific Computing, 2018
    Co-Authors: Maziar Raissi, Paris Perdikaris, George Em Karniadakis
    Abstract:

    We introduce the concept of numerical Gaussian processes, which we define as Gaussian processes with covariance functions resulting from temporal discretization of time-dependent Partial Differential Equations. Numerical Gaussian processes, by construction, are designed to deal with cases where (a) all we observe are noisy data on black-box initial conditions, and (b) we are interested in quantifying the uncertainty associated with such noisy data in our solutions to time-dependent Partial Differential Equations. Our method circumvents the need for spatial discretization of the Differential operators by proper placement of Gaussian process priors. This is an attempt to construct structured and data-efficient learning machines, which are explicitly informed by the underlying physics that possibly generated the observed data. The effectiveness of the proposed approach is demonstrated through several benchmark problems involving linear and nonlinear time-dependent operators. In all examples, we are able to recover accurate approximations of the latent solutions, and consistently propagate uncertainty, even in cases involving very long time integration.

  • Physics-informed deep learning (Part II): Data-driven discovery of nonlinear Partial Differential Equations
    arXiv: Artificial Intelligence, 2017
    Co-Authors: Maziar Raissi, Paris Perdikaris, George Em Karniadakis
    Abstract:

    We introduce physics-informed neural networks -- neural networks that are trained to solve supervised learning tasks while respecting any given law of physics described by general nonlinear Partial Differential Equations. In this second part of our two-part treatise, we focus on the problem of data-driven discovery of Partial Differential Equations. Depending on whether the available data is scattered in space-time or arranged in fixed temporal snapshots, we introduce two main classes of algorithms, namely continuous-time and discrete-time models. The effectiveness of our approach is demonstrated using a wide range of benchmark problems in mathematical physics, including conservation laws, incompressible fluid flow, and the propagation of nonlinear shallow-water waves.

  • Numerical Gaussian processes for time-dependent and non-linear Partial Differential Equations
    arXiv: Machine Learning, 2017
    Co-Authors: Maziar Raissi, Paris Perdikaris, George Em Karniadakis
    Abstract:

    We introduce the concept of numerical Gaussian processes, which we define as Gaussian processes with covariance functions resulting from temporal discretization of time-dependent Partial Differential Equations. Numerical Gaussian processes, by construction, are designed to deal with cases where: (1) all we observe are noisy data on black-box initial conditions, and (2) we are interested in quantifying the uncertainty associated with such noisy data in our solutions to time-dependent Partial Differential Equations. Our method circumvents the need for spatial discretization of the Differential operators by proper placement of Gaussian process priors. This is an attempt to construct structured and data-efficient learning machines, which are explicitly informed by the underlying physics that possibly generated the observed data. The effectiveness of the proposed approach is demonstrated through several benchmark problems involving linear and nonlinear time-dependent operators. In all examples, we are able to recover accurate approximations of the latent solutions, and consistently propagate uncertainty, even in cases involving very long time integration.

Pengfei Guan - One of the best experts on this subject based on the ideXlab platform.

Shaher Momani - One of the best experts on this subject based on the ideXlab platform.

  • A novel expansion iterative method for solving linear Partial Differential Equations of fractional order
    Applied Mathematics and Computation, 2015
    Co-Authors: Ahmad Elajou, Shaher Momani, Omar Abu Arqub, Dumitru Baleanu, Ahmed Alsaedi
    Abstract:

    In this manuscript, we implement a relatively new analytic iterative technique for solving time-space-fractional linear Partial Differential Equations subject to given constraint conditions, based on the generalized Taylor series formula. The solution methodology is based on generating a multiple fractional power series expansion of the solution in the form of a rapidly convergent series with a minimal amount of computation. This method can be used as an alternative for obtaining analytic solutions of different types of fractional linear Partial Differential Equations applied in mathematics, physics, and engineering. Several numerical test applications are analyzed to illustrate the procedure and to confirm the performance of the proposed method, and to show its potentiality, generality, and accuracy for solving such Equations under different constraint conditions. Numerical results coupled with graphical representations explicitly reveal the complete reliability and efficiency of the suggested algorithm.
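
    For orientation, the fractional power series underlying such generalized Taylor series techniques can be written in the generic form below (quoted from the general literature on generalized Taylor expansions; the notation is an assumption and may differ in detail from the paper):

    ```latex
    u(x,t) \;=\; \sum_{n=0}^{\infty} f_n(x)\,\frac{t^{\,n\alpha}}{\Gamma(n\alpha+1)},
    \qquad 0 < \alpha \le 1, \qquad
    f_n(x) \;=\; \bigl(\underbrace{D_t^{\alpha}\cdots D_t^{\alpha}}_{n\ \text{times}}\,u\bigr)(x,0),
    \qquad\text{with}\qquad
    D_t^{\alpha}\,t^{\,p} \;=\; \frac{\Gamma(p+1)}{\Gamma(p+1-\alpha)}\,t^{\,p-\alpha}\ \ (p>0),
    \qquad D_t^{\alpha}\,\mathrm{const} \;=\; 0 .
    ```

    Substituting this ansatz into the fractional PDE and requiring the successive Caputo derivatives of the residual to vanish at t = 0 determines the coefficient functions f_n(x) one at a time, which is what produces the rapidly convergent series referred to above.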

  • A generalized Differential transform method for linear Partial Differential Equations of fractional order
    Applied Mathematics Letters, 2008
    Co-Authors: Zaid Odibat, Shaher Momani
    Abstract:

    In this letter we develop a new generalization of the two-dimensional Differential transform method that extends the application of the method to linear Partial Differential Equations with space- and time-fractional derivatives. The new generalization is based on the two-dimensional Differential transform method, the generalized Taylor formula, and the Caputo fractional derivative. Several illustrative examples are given to demonstrate the effectiveness of the present method. The results reveal that the technique introduced here is very effective and convenient for solving linear Partial Differential Equations of fractional order.
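
    The two-dimensional generalized Differential transform referred to above pairs a function with an array of fractional Taylor coefficients. In the notation commonly used in the GDTM literature (stated here as background; the details may differ slightly from the paper):

    ```latex
    U_{\alpha,\beta}(k,h) \;=\;
    \frac{1}{\Gamma(\alpha k + 1)\,\Gamma(\beta h + 1)}
    \Bigl[\bigl(D_{x_0}^{\alpha}\bigr)^{k}\bigl(D_{t_0}^{\beta}\bigr)^{h}\, u(x,t)\Bigr]_{(x_0,\,t_0)},
    \qquad
    u(x,t) \;=\; \sum_{k=0}^{\infty}\sum_{h=0}^{\infty}
    U_{\alpha,\beta}(k,h)\,(x-x_0)^{\alpha k}\,(t-t_0)^{\beta h}.
    ```

    Linear operations on u become simple recurrences on U_{α,β}(k,h); for example, applying a Caputo derivative of order α in x shifts k by one and multiplies by Γ(α(k+1)+1)/Γ(αk+1), which is what turns a linear fractional PDE into an algebraic recursion for the transform coefficients.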

  • Numerical methods for nonlinear Partial Differential Equations of fractional order
    Applied Mathematical Modelling, 2008
    Co-Authors: Zaid Odibat, Shaher Momani
    Abstract:

    In this article, we implement relatively new analytical techniques, the variational iteration method and the Adomian decomposition method, for solving nonlinear Partial Differential Equations of fractional order. The fractional derivatives are described in the Caputo sense. The two methods can be used as alternatives for obtaining analytic and approximate solutions of different types of fractional Differential Equations. In both schemes, the solution takes the form of a convergent series with easily computable components. Numerical results show that the two approaches are easy to implement and accurate when applied to Partial Differential Equations of fractional order.
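
    As a concrete illustration of the decomposition idea, here is a short symbolic sketch on an assumed toy problem (not an example taken from the paper): the time-fractional diffusion equation D_t^alpha u = u_xx with u(x, 0) = sin(x), where each Adomian term is the fractional integral of the second spatial derivative of the previous term.

    ```python
    import sympy as sp

    x, t, a = sp.symbols('x t alpha', positive=True)

    # Adomian decomposition for the time-fractional diffusion equation
    #   D_t^alpha u = u_xx,  0 < alpha <= 1,  u(x, 0) = sin(x).
    # Recursion: u_0 = sin(x), u_{n+1} = J^alpha( d^2 u_n / dx^2 ), where J^alpha is the
    # Riemann-Liouville fractional integral in t. Each term has the form
    #   u_n = f_n(x) * t**(n*alpha) / Gamma(n*alpha + 1),
    # and J^alpha( t**p / Gamma(p+1) ) = t**(p+alpha) / Gamma(p+alpha+1).
    f, p = sp.sin(x), sp.Integer(0)
    series = sp.Integer(0)
    for n in range(4):                  # first four terms of the decomposition
        series += f * t**p / sp.gamma(p + 1)
        f = sp.diff(f, x, 2)            # spatial factor of the next term
        p = p + a                       # fractional integration raises the power of t by alpha

    print(sp.simplify(series))
    # The pattern is sin(x) * sum_n (-t**alpha)**n / Gamma(n*alpha + 1), i.e. a truncation of
    # sin(x) * E_alpha(-t**alpha) with E_alpha the Mittag-Leffler function; setting alpha = 1
    # recovers the classical heat-equation solution sin(x) * exp(-t).
    ```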