Curse of Dimensionality

The Experts below are selected from a list of 19,920 Experts worldwide ranked by the ideXlab platform

Arnulf Jentzen - One of the best experts on this subject based on the ideXlab platform.

  • overcoming the Curse of Dimensionality in the numerical approximation of high dimensional semilinear elliptic partial differential equations
    arXiv: Probability, 2020
    Co-Authors: Christian Beck, Lukas Gonon, Arnulf Jentzen
    Abstract:

    Recently, so-called full-history recursive multilevel Picard (MLP) approximation schemes have been introduced and shown to overcome the Curse of Dimensionality in the numerical approximation of semilinear parabolic partial differential equations (PDEs) with Lipschitz nonlinearities. The key contribution of this article is to introduce and analyze a new variant of MLP approximation schemes for certain semilinear elliptic PDEs with Lipschitz nonlinearities and to prove that the proposed approximation schemes overcome the Curse of Dimensionality in the numerical approximation of such semilinear elliptic PDEs.

  • overcoming the Curse of Dimensionality in the approximative pricing of financial derivatives with default risks
    Electronic Journal of Probability, 2020
    Co-Authors: Martin Hutzenthaler, Arnulf Jentzen, Philippe Von Wurstemberger
    Abstract:

    Parabolic partial differential equations (PDEs) are widely used in the mathematical modeling of natural phenomena and man-made complex systems. In particular, parabolic PDEs are a fundamental tool to approximately determine fair prices of financial derivatives in the financial engineering industry. The PDEs appearing in financial engineering applications are often nonlinear (e.g., in PDE models which take into account the possibility of a defaulting counterparty) and high-dimensional since the dimension typically corresponds to the number of considered financial assets. A major issue in the scientific literature is that most approximation methods for nonlinear PDEs suffer from the so-called Curse of Dimensionality in the sense that the computational effort to compute an approximation with a prescribed accuracy grows exponentially in the dimension of the PDE or in the reciprocal of the prescribed approximation accuracy. Moreover, nearly all approximation methods for nonlinear PDEs in the scientific literature have not been shown not to suffer from the Curse of Dimensionality. Recently, a new class of approximation schemes for semilinear parabolic PDEs, termed full-history recursive multilevel Picard (MLP) algorithms, was introduced, and it was proven that MLP algorithms do overcome the Curse of Dimensionality for semilinear heat equations. In this paper we extend and generalize those findings to a more general class of semilinear PDEs which includes as special cases the important examples of semilinear Black-Scholes equations used in pricing models for financial derivatives with default risks. In particular, we introduce an MLP algorithm for the approximation of solutions of semilinear Black-Scholes equations and prove, under the assumption that the nonlinearity in the PDE is globally Lipschitz continuous, that the computational effort of the proposed method grows at most polynomially in both the dimension and the reciprocal of the prescribed approximation accuracy. We thereby establish, for the first time, that the numerical approximation of solutions of semilinear Black-Scholes equations is a polynomially tractable approximation problem.
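The cost dichotomy described in this abstract can be made concrete with a toy calculation (my own illustration, not taken from the paper): a grid-based scheme that places N points in each of d coordinate directions needs N**d nodes, exponentially many in d, whereas a method whose effort is bounded by a polynomial in d and in the reciprocal accuracy 1/eps remains feasible in high dimensions.

```python
# Hypothetical illustration of the curse of dimensionality (not from the paper).

def grid_points(n_per_dim: int, d: int) -> int:
    """Number of nodes in a tensor-product grid: N**d, exponential in d."""
    return n_per_dim ** d

def polynomial_cost(d: int, eps: float, p: int = 2, q: int = 2) -> float:
    """A generic polynomial-tractability bound of the form d**p * eps**(-q)."""
    return d ** p * eps ** (-q)

for d in (1, 10, 100):
    print(d, grid_points(10, d), polynomial_cost(d, 1e-2))
```

For N = 10 and d = 100 the grid already has 10**100 nodes, far beyond any computer, while the polynomial bound stays modest; the exponents p and q above are placeholders, not the constants from the paper.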

  • overcoming the Curse of Dimensionality in the numerical approximation of parabolic partial differential equations with gradient dependent nonlinearities
    SAM Research Report, 2019
    Co-Authors: Martin Hutzenthaler, Arnulf Jentzen, Thomas Kruse
    Abstract:

    Partial differential equations (PDEs) are a fundamental tool in the modeling of many real-world phenomena. In a number of such real-world phenomena the PDEs under consideration contain gradient-dependent nonlinearities and are high-dimensional. Such high-dimensional nonlinear PDEs can, in nearly all cases, not be solved explicitly, and it is one of the most challenging tasks in applied mathematics to solve high-dimensional nonlinear PDEs approximately. It is especially challenging to design approximation algorithms for nonlinear PDEs for which one can rigorously prove that they do overcome the so-called Curse of Dimensionality in the sense that the number of computational operations of the approximation algorithm needed to achieve an approximation precision of size $\varepsilon > 0$ grows at most polynomially in both the PDE dimension $d \in \mathbb{N}$ and the reciprocal of the prescribed approximation accuracy $\varepsilon$. In particular, to the best of our knowledge there exists no approximation algorithm in the scientific literature which has been proven to overcome the Curse of Dimensionality in the case of a class of nonlinear PDEs with general time horizons and gradient-dependent nonlinearities. It is the key contribution of this article to overcome this difficulty. More specifically, it is the key contribution of this article (i) to propose a new full-history recursive multilevel Picard approximation algorithm for high-dimensional nonlinear heat equations with general time horizons and gradient-dependent nonlinearities and (ii) to rigorously prove that this full-history recursive multilevel Picard approximation algorithm does indeed overcome the Curse of Dimensionality in the case of such nonlinear heat equations with gradient-dependent nonlinearities.
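As a rough illustration of the recursion behind full-history recursive MLP schemes, the sketch below implements a minimal variant for the simpler gradient-independent semilinear heat equation (a hypothetical toy code of mine, not the authors' algorithm for the gradient-dependent case): level n combines a Monte Carlo estimate of the terminal condition with multilevel corrections of the nonlinearity evaluated at lower-level approximations.

```python
import numpy as np

# Toy MLP sketch (hypothetical, gradient-independent case) for
#     du/dt + (1/2) Laplacian u + f(u) = 0,  u(T, x) = g(x),
# based on the fixed-point identity
#     u(t,x) = E[g(x + W_{T-t})] + int_t^T E[f(u(s, x + W_{s-t}))] ds.

rng = np.random.default_rng(0)

def mlp(n, M, t, x, T, f, g):
    """Level-n MLP approximation of u(t, x); x is a d-dimensional vector."""
    d = x.shape[0]
    if n == 0:
        return 0.0                                   # U_0 is defined to be zero
    # Monte Carlo estimate of the terminal-condition term with M**n samples.
    dW = rng.normal(0.0, np.sqrt(T - t), size=(M ** n, d))
    out = np.mean([g(x + w) for w in dW])
    # Multilevel telescoping sum over the nonlinearity.
    for l in range(n):
        m = M ** (n - l)
        acc = 0.0
        for _ in range(m):
            s = t + (T - t) * rng.uniform()          # uniform time sample in [t, T]
            w = rng.normal(0.0, np.sqrt(s - t), size=d)
            y = x + w
            acc += f(mlp(l, M, s, y, T, f, g))
            if l > 0:
                acc -= f(mlp(l - 1, M, s, y, T, f, g))
        out += (T - t) * acc / m
    return out
```

With f identically zero the recursion collapses to plain Monte Carlo for the linear heat equation; the telescoping over levels l is what keeps the total work polynomial in the accuracy parameter in the authors' analysis.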

  • overcoming the Curse of Dimensionality in the approximative pricing of financial derivatives with default risks
    SAM Research Report, 2019
    Co-Authors: Martin Hutzenthaler, Arnulf Jentzen, Philippe Von Wurstemberger
    Abstract:

    Parabolic partial differential equations (PDEs) are widely used in the mathematical modeling of natural phenomena and man-made complex systems. In particular, parabolic PDEs are a fundamental tool to determine fair prices of financial derivatives in the financial industry. The PDEs appearing in financial engineering applications are often nonlinear and high-dimensional since the dimension typically corresponds to the number of considered financial assets. A major issue is that most approximation methods for nonlinear PDEs in the literature suffer from the so-called Curse of Dimensionality in the sense that the computational effort to compute an approximation with a prescribed accuracy grows exponentially in the dimension of the PDE or in the reciprocal of the prescribed approximation accuracy. Moreover, nearly all approximation methods have not been shown not to suffer from the Curse of Dimensionality. Recently, a new class of approximation schemes for semilinear parabolic PDEs, termed full-history recursive multilevel Picard (MLP) algorithms, was introduced, and it was proven that MLP algorithms do overcome the Curse of Dimensionality for semilinear heat equations. In this paper we extend those findings to a more general class of semilinear PDEs including as special cases semilinear Black-Scholes equations used for the pricing of financial derivatives with default risks. More specifically, we introduce an MLP algorithm for the approximation of solutions of semilinear Black-Scholes equations and prove that the computational effort of our method grows at most polynomially both in the dimension and the reciprocal of the prescribed approximation accuracy. This is, to the best of our knowledge, the first result showing that the approximation of solutions of semilinear Black-Scholes equations is a polynomially tractable approximation problem.

  • a proof that deep artificial neural networks overcome the Curse of Dimensionality in the numerical approximation of Kolmogorov partial differential equations with constant diffusion and nonlinear drift coefficients
    arXiv: Numerical Analysis, 2018
    Co-Authors: Arnulf Jentzen, Diyora Salimova, Timo Welti
    Abstract:

    In recent years deep artificial neural networks (DNNs) have been successfully employed in numerical simulations for a multitude of computational problems including, for example, object and face recognition, natural language processing, fraud detection, computational advertisement, and numerical approximations of partial differential equations (PDEs). These numerical simulations indicate that DNNs seem to possess the fundamental flexibility to overcome the Curse of Dimensionality in the sense that the number of real parameters used to describe the DNN grows at most polynomially in both the reciprocal of the prescribed approximation accuracy $ \varepsilon > 0 $ and the dimension $ d \in \mathbb{N}$ of the function which the DNN aims to approximate in such computational problems. There is also a large number of rigorous mathematical approximation results for artificial neural networks in the scientific literature but there are only a few special situations where results in the literature can rigorously justify the success of DNNs in high-dimensional function approximation. The key contribution of this paper is to reveal that DNNs do overcome the Curse of Dimensionality in the numerical approximation of Kolmogorov PDEs with constant diffusion and nonlinear drift coefficients. We prove that the number of parameters used to describe the employed DNN grows at most polynomially in both the PDE dimension $ d \in \mathbb{N}$ and the reciprocal of the prescribed approximation accuracy $ \varepsilon > 0 $. A crucial ingredient in our proof is the fact that the artificial neural network used to approximate the solution of the PDE is indeed a deep artificial neural network with a large number of hidden layers.
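The quantity bounded in this result, the number of real parameters of a DNN, is simply the total count of weights and biases summed over layers; the toy function below (my own illustration, not from the paper) shows that if the layer widths grow at most polynomially in d, so does the parameter count.

```python
# Toy parameter count for a fully connected feedforward network
# (hypothetical illustration, not the construction used in the paper).

def num_parameters(widths):
    """widths = [d_in, h_1, ..., h_L, d_out]; total weight + bias count."""
    return sum(a * b + b for a, b in zip(widths, widths[1:]))

# Widths polynomial in the input dimension d keep the count polynomial in d.
for d in (10, 100, 1000):
    print(d, num_parameters([d, 2 * d, 2 * d, 1]))
```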

Martin Hutzenthaler - One of the best experts on this subject based on the ideXlab platform.

  • overcoming the Curse of Dimensionality in the approximative pricing of financial derivatives with default risks
    Electronic Journal of Probability, 2020
    Co-Authors: Martin Hutzenthaler, Arnulf Jentzen, Philippe Von Wurstemberger
    Abstract:

    Parabolic partial differential equations (PDEs) are widely used in the mathematical modeling of natural phenomena and man-made complex systems. In particular, parabolic PDEs are a fundamental tool to approximately determine fair prices of financial derivatives in the financial engineering industry. The PDEs appearing in financial engineering applications are often nonlinear (e.g., in PDE models which take into account the possibility of a defaulting counterparty) and high-dimensional since the dimension typically corresponds to the number of considered financial assets. A major issue in the scientific literature is that most approximation methods for nonlinear PDEs suffer from the so-called Curse of Dimensionality in the sense that the computational effort to compute an approximation with a prescribed accuracy grows exponentially in the dimension of the PDE or in the reciprocal of the prescribed approximation accuracy. Moreover, nearly all approximation methods for nonlinear PDEs in the scientific literature have not been shown not to suffer from the Curse of Dimensionality. Recently, a new class of approximation schemes for semilinear parabolic PDEs, termed full-history recursive multilevel Picard (MLP) algorithms, was introduced, and it was proven that MLP algorithms do overcome the Curse of Dimensionality for semilinear heat equations. In this paper we extend and generalize those findings to a more general class of semilinear PDEs which includes as special cases the important examples of semilinear Black-Scholes equations used in pricing models for financial derivatives with default risks. In particular, we introduce an MLP algorithm for the approximation of solutions of semilinear Black-Scholes equations and prove, under the assumption that the nonlinearity in the PDE is globally Lipschitz continuous, that the computational effort of the proposed method grows at most polynomially in both the dimension and the reciprocal of the prescribed approximation accuracy. We thereby establish, for the first time, that the numerical approximation of solutions of semilinear Black-Scholes equations is a polynomially tractable approximation problem.

  • overcoming the Curse of Dimensionality in the numerical approximation of parabolic partial differential equations with gradient dependent nonlinearities
    SAM Research Report, 2019
    Co-Authors: Martin Hutzenthaler, Arnulf Jentzen, Thomas Kruse
    Abstract:

    Partial differential equations (PDEs) are a fundamental tool in the modeling of many real-world phenomena. In a number of such real-world phenomena the PDEs under consideration contain gradient-dependent nonlinearities and are high-dimensional. Such high-dimensional nonlinear PDEs can, in nearly all cases, not be solved explicitly, and it is one of the most challenging tasks in applied mathematics to solve high-dimensional nonlinear PDEs approximately. It is especially challenging to design approximation algorithms for nonlinear PDEs for which one can rigorously prove that they do overcome the so-called Curse of Dimensionality in the sense that the number of computational operations of the approximation algorithm needed to achieve an approximation precision of size $\varepsilon > 0$ grows at most polynomially in both the PDE dimension $d \in \mathbb{N}$ and the reciprocal of the prescribed approximation accuracy $\varepsilon$. In particular, to the best of our knowledge there exists no approximation algorithm in the scientific literature which has been proven to overcome the Curse of Dimensionality in the case of a class of nonlinear PDEs with general time horizons and gradient-dependent nonlinearities. It is the key contribution of this article to overcome this difficulty. More specifically, it is the key contribution of this article (i) to propose a new full-history recursive multilevel Picard approximation algorithm for high-dimensional nonlinear heat equations with general time horizons and gradient-dependent nonlinearities and (ii) to rigorously prove that this full-history recursive multilevel Picard approximation algorithm does indeed overcome the Curse of Dimensionality in the case of such nonlinear heat equations with gradient-dependent nonlinearities.

  • overcoming the Curse of Dimensionality in the approximative pricing of financial derivatives with default risks
    SAM Research Report, 2019
    Co-Authors: Martin Hutzenthaler, Arnulf Jentzen, Philippe Von Wurstemberger
    Abstract:

    Parabolic partial differential equations (PDEs) are widely used in the mathematical modeling of natural phenomena and man-made complex systems. In particular, parabolic PDEs are a fundamental tool to determine fair prices of financial derivatives in the financial industry. The PDEs appearing in financial engineering applications are often nonlinear and high-dimensional since the dimension typically corresponds to the number of considered financial assets. A major issue is that most approximation methods for nonlinear PDEs in the literature suffer from the so-called Curse of Dimensionality in the sense that the computational effort to compute an approximation with a prescribed accuracy grows exponentially in the dimension of the PDE or in the reciprocal of the prescribed approximation accuracy. Moreover, nearly all approximation methods have not been shown not to suffer from the Curse of Dimensionality. Recently, a new class of approximation schemes for semilinear parabolic PDEs, termed full-history recursive multilevel Picard (MLP) algorithms, was introduced, and it was proven that MLP algorithms do overcome the Curse of Dimensionality for semilinear heat equations. In this paper we extend those findings to a more general class of semilinear PDEs including as special cases semilinear Black-Scholes equations used for the pricing of financial derivatives with default risks. More specifically, we introduce an MLP algorithm for the approximation of solutions of semilinear Black-Scholes equations and prove that the computational effort of our method grows at most polynomially both in the dimension and the reciprocal of the prescribed approximation accuracy. This is, to the best of our knowledge, the first result showing that the approximation of solutions of semilinear Black-Scholes equations is a polynomially tractable approximation problem.

  • a proof that rectified deep neural networks overcome the Curse of Dimensionality in the numerical approximation of semilinear heat equations
    arXiv: Numerical Analysis, 2019
    Co-Authors: Martin Hutzenthaler, Thomas Kruse, Tuan-anh Nguyen
    Abstract:

    Deep neural networks and other deep learning methods have very successfully been applied to the numerical approximation of high-dimensional nonlinear parabolic partial differential equations (PDEs), which are widely used in finance, engineering, and natural sciences. In particular, simulations indicate that algorithms based on deep learning overcome the Curse of Dimensionality in the numerical approximation of solutions of semilinear PDEs. For certain linear PDEs this has also been proved mathematically. The key contribution of this article is to rigorously prove this for the first time for a class of nonlinear PDEs. More precisely, we prove in the case of semilinear heat equations with gradient-independent nonlinearities that the numbers of parameters of the employed deep neural networks grow at most polynomially in both the PDE dimension and the reciprocal of the prescribed approximation accuracy. Our proof relies on recently introduced multilevel Picard approximations of semilinear PDEs.

  • overcoming the Curse of Dimensionality in the numerical approximation of semilinear parabolic partial differential equations
    arXiv: Probability, 2018
    Co-Authors: Martin Hutzenthaler, Thomas Kruse, Tuan-anh Nguyen, Arnulf Jentzen, Philippe Von Wurstemberger
    Abstract:

    It has long been known that high-dimensional linear parabolic partial differential equations (PDEs) can be approximated by Monte Carlo methods with a computational effort which grows polynomially both in the dimension and in the reciprocal of the prescribed accuracy. In other words, linear PDEs do not suffer from the Curse of Dimensionality. For general semilinear PDEs with Lipschitz coefficients, however, it remained an open question whether these suffer from the Curse of Dimensionality. In this paper we partially solve this open problem. More precisely, we prove in the case of semilinear heat equations with gradient-independent and globally Lipschitz continuous nonlinearities that the computational effort of a variant of the recently introduced multilevel Picard approximations grows polynomially both in the dimension and in the reciprocal of the required accuracy.
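The classical fact recalled at the start of this abstract can be sketched in a few lines (a hedged toy example of mine, not code from the paper): by the Feynman-Kac representation u(t,x) = E[g(x + W_{T-t})], the linear heat equation can be approximated with O(M * d) work per evaluation, polynomial in the dimension d.

```python
import numpy as np

# Monte Carlo approximation of the d-dimensional heat equation
#     du/dt + (1/2) Laplacian u = 0,  u(T, x) = g(x),
# via Feynman-Kac: u(t, x) = E[g(x + W_{T-t})]. Toy illustration only.

def heat_mc(g, t, x, T, M, rng):
    """Sample mean over M Brownian increments; g must be vectorized over samples."""
    d = x.shape[0]
    dW = rng.normal(0.0, np.sqrt(T - t), size=(M, d))
    return np.mean(g(x + dW))
```

For g(y) = |y|^2 the exact value at x = 0, t = 0 is d*T, which the sample mean approaches at the usual Monte Carlo rate M^(-1/2), independently of d.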

Francis Bach - One of the best experts on this subject based on the ideXlab platform.

  • breaking the Curse of Dimensionality with convex neural networks
    Journal of Machine Learning Research, 2017
    Co-Authors: Francis Bach
    Abstract:

    We consider neural networks with a single hidden layer and non-decreasing positively homogeneous activation functions like the rectified linear units. By letting the number of hidden units grow unbounded and using classical non-Euclidean regularization tools on the output weights, they lead to a convex optimization problem and we provide a detailed theoretical analysis of their generalization performance, with a study of both the approximation and the estimation errors. We show in particular that they are adaptive to unknown underlying linear structures, such as the dependence on the projection of the input variables onto a low-dimensional subspace. Moreover, when using sparsity-inducing norms on the input weights, we show that high-dimensional non-linear variable selection may be achieved, without any strong assumption regarding the data and with a total number of variables potentially exponential in the number of observations. However, solving this convex optimization problem in infinite dimensions is only possible if the nonconvex subproblem of addition of a new unit can be solved efficiently. We provide a simple geometric interpretation for our choice of activation functions and describe simple conditions for convex relaxations of the finite-dimensional non-convex subproblem to achieve the same generalization error bounds, even when constant-factor approximations cannot be found. We were not able to find strong enough convex relaxations to obtain provably polynomial-time algorithms and leave open the existence or non-existence of such tractable algorithms with non-exponential sample complexities.
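A minimal way to see the convexity exploited here (my own sketch under simplifying assumptions, not Bach's framework in full): if the hidden units are fixed, for example drawn at random, then fitting only the output weights with an l1 penalty is a convex Lasso-type problem, solvable by proximal gradient descent. The genuinely hard, non-convex part discussed in the paper is selecting or adding the best new unit.

```python
import numpy as np

# Fixed random ReLU dictionary + convex l1 fit of the output weights
# (hypothetical illustration, not the paper's algorithm).

def relu_features(X, W, b):
    """Hidden-layer activations for fixed input weights W and biases b."""
    return np.maximum(X @ W + b, 0.0)

def fit_l1(Phi, y, lam=0.01, steps=500):
    """ISTA (proximal gradient) for min_c ||Phi c - y||^2 / (2n) + lam ||c||_1."""
    n = len(y)
    L = np.linalg.norm(Phi, 2) ** 2 / n              # Lipschitz constant of the gradient
    c = np.zeros(Phi.shape[1])
    for _ in range(steps):
        grad = Phi.T @ (Phi @ c - y) / n
        z = c - grad / L
        c = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft thresholding
    return c
```

Since the objective decreases monotonically from c = 0, the training error can only improve; the l1 penalty plays the role of the non-Euclidean output-weight regularization in the paper.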

  • breaking the Curse of Dimensionality with convex neural networks
    arXiv: Learning, 2014
    Co-Authors: Francis Bach
    Abstract:

    We consider neural networks with a single hidden layer and non-decreasing homogeneous activation functions like the rectified linear units. By letting the number of hidden units grow unbounded and using classical non-Euclidean regularization tools on the output weights, we provide a detailed theoretical analysis of their generalization performance, with a study of both the approximation and the estimation errors. We show in particular that they are adaptive to unknown underlying linear structures, such as the dependence on the projection of the input variables onto a low-dimensional subspace. Moreover, when using sparsity-inducing norms on the input weights, we show that high-dimensional non-linear variable selection may be achieved, without any strong assumption regarding the data and with a total number of variables potentially exponential in the number of observations. In addition, we provide a simple geometric interpretation to the non-convex problem of addition of a new unit, which is the core potentially hard computational element in the framework of learning from continuously many basis functions. We provide simple conditions for convex relaxations to achieve the same generalization error bounds, even when constant-factor approximations cannot be found (e.g., because it is NP-hard such as for the zero-homogeneous activation function). We were not able to find strong enough convex relaxations and leave open the existence or non-existence of polynomial-time algorithms.

Jentzen Arnulf - One of the best experts on this subject based on the ideXlab platform.

  • Lower bounds for artificial neural network approximations: A proof that shallow neural networks fail to overcome the Curse of Dimensionality
    2021
    Co-Authors: Grohs Philipp, Jentzen Arnulf, Ibragimov Shokhrukh, Koppensteiner Sarah
    Abstract:

    Artificial neural networks (ANNs) have become a very powerful tool in the approximation of high-dimensional functions. Especially, deep ANNs, consisting of a large number of hidden layers, have been very successfully used in a series of practically relevant computational problems involving high-dimensional input data ranging from classification tasks in supervised learning to optimal decision problems in reinforcement learning. There are also a number of mathematical results in the scientific literature which study the approximation capacities of ANNs in the context of high-dimensional target functions. In particular, there are a series of mathematical results in the scientific literature which show that sufficiently deep ANNs have the capacity to overcome the Curse of Dimensionality in the approximation of certain target function classes in the sense that the number of parameters of the approximating ANNs grows at most polynomially in the dimension $d \in \mathbb{N}$ of the target functions under consideration. In the proofs of several such high-dimensional approximation results it is crucial that the involved ANNs are sufficiently deep and consist of a sufficiently large number of hidden layers, which grows with the dimension of the considered target functions. It is the topic of this work to look in more detail at the depth of the involved ANNs in the approximation of high-dimensional target functions. In particular, the main result of this work proves that there exists a concretely specified sequence of functions which can be approximated without the Curse of Dimensionality by sufficiently deep ANNs but which cannot be approximated without the Curse of Dimensionality if the involved ANNs are shallow or not deep enough.

  • Multilevel Picard approximations for high-dimensional semilinear second-order PDEs with Lipschitz nonlinearities
    2020
    Co-Authors: Hutzenthaler Martin, Jentzen Arnulf, Kruse Thomas, Nguyen Tuan Anh
    Abstract:

    The recently introduced full-history recursive multilevel Picard (MLP) approximation methods have turned out to be quite successful in the numerical approximation of solutions of high-dimensional nonlinear PDEs. In particular, there are mathematical convergence results in the literature which prove that MLP approximation methods do overcome the Curse of Dimensionality in the numerical approximation of nonlinear second-order PDEs in the sense that the number of computational operations of the proposed MLP approximation method grows at most polynomially in both the reciprocal $1/\epsilon$ of the prescribed approximation accuracy $\epsilon>0$ and the PDE dimension $d\in \mathbb{N}=\{1,2,3, \ldots\}$. However, in each of the convergence results for MLP approximation methods in the literature it is assumed that the coefficient functions in front of the second-order differential operator are affine linear. In particular, to date there is no result in the scientific literature which proves that any semilinear second-order PDE with a general time horizon and a non-affine-linear coefficient function in front of the second-order differential operator can be approximated without the Curse of Dimensionality. It is the key contribution of this article to overcome this obstacle and to propose and analyze a new type of MLP approximation method for semilinear second-order PDEs with possibly nonlinear coefficient functions in front of the second-order differential operators. In particular, the main result of this article proves that this new MLP approximation method does indeed overcome the Curse of Dimensionality in the numerical approximation of semilinear second-order PDEs.

  • Overcoming the Curse of Dimensionality in the numerical approximation of high-dimensional semilinear elliptic partial differential equations
    2020
    Co-Authors: Beck Christian, Gonon Lukas, Jentzen Arnulf
    Abstract:

    Recently, so-called full-history recursive multilevel Picard (MLP) approximation schemes have been introduced and shown to overcome the Curse of Dimensionality in the numerical approximation of semilinear parabolic partial differential equations (PDEs) with Lipschitz nonlinearities. The key contribution of this article is to introduce and analyze a new variant of MLP approximation schemes for certain semilinear elliptic PDEs with Lipschitz nonlinearities and to prove that the proposed approximation schemes overcome the Curse of Dimensionality in the numerical approximation of such semilinear elliptic PDEs.

  • On nonlinear Feynman-Kac formulas for viscosity solutions of semilinear parabolic partial differential equations
    2020
    Co-Authors: Beck Christian, Hutzenthaler Martin, Jentzen Arnulf
    Abstract:

    The classical Feynman-Kac identity builds a bridge between stochastic analysis and partial differential equations (PDEs) by providing stochastic representations for classical solutions of linear Kolmogorov PDEs. This opens the door for the derivation of sampling-based Monte Carlo approximation methods, which can be meshfree and thereby stand a chance to approximate solutions of PDEs without suffering from the Curse of Dimensionality. In this article we extend the classical Feynman-Kac formula to certain semilinear Kolmogorov PDEs. More specifically, we identify suitable solutions of stochastic fixed point equations (SFPEs), which arise when the classical Feynman-Kac identity is formally applied to semilinear Kolmogorov PDEs, as viscosity solutions of the corresponding PDEs. This justifies, in particular, employing full-history recursive multilevel Picard (MLP) approximation algorithms, which have recently been shown to overcome the Curse of Dimensionality in the numerical approximation of solutions of SFPEs, in the numerical approximation of semilinear Kolmogorov PDEs.

Philippe Von Wurstemberger - One of the best experts on this subject based on the ideXlab platform.

  • a proof that artificial neural networks overcome the Curse of Dimensionality in the numerical approximation of Black-Scholes partial differential equations
    The annual research report, 2018
    Co-Authors: Philipp Grohs, Arnulf Jentzen, Fabian Hornung, Philippe Von Wurstemberger
    Abstract:

    Artificial neural networks (ANNs) have very successfully been used in numerical simulations for a series of computational problems ranging from image classification/image recognition, speech recognition, time series analysis, game intelligence, and computational advertising to numerical approximations of partial differential equations (PDEs). Such numerical simulations suggest that ANNs have the capacity to very efficiently approximate high-dimensional functions and, especially, such numerical simulations indicate that ANNs seem to admit the fundamental power to overcome the Curse of Dimensionality when approximating the high-dimensional functions appearing in the above-named computational problems. There are also a series of rigorous mathematical approximation results for ANNs in the scientific literature. Some of these mathematical results prove convergence without convergence rates and some of these mathematical results even rigorously establish convergence rates but there are only a few special cases where mathematical results can rigorously explain the empirical success of ANNs when approximating high-dimensional functions. The key contribution of this article is to disclose that ANNs can efficiently approximate high-dimensional functions in the case of numerical approximations of Black-Scholes PDEs. More precisely, this work reveals that the number of required parameters of an ANN to approximate the solution of the Black-Scholes PDE grows at most polynomially in both the reciprocal of the prescribed approximation accuracy $\varepsilon > 0$ and the PDE dimension $d \in \mathbb{N}$ and we thereby prove, for the first time, that ANNs do indeed overcome the Curse of Dimensionality in the numerical approximation of Black-Scholes PDEs.
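For contrast with the ANN approximation result, the linear Black-Scholes PDE itself admits a Feynman-Kac Monte Carlo representation whose per-sample cost is linear in d; the hypothetical sketch below (my own illustration, not the paper's construction) prices a European call on the average of d geometric Brownian motions.

```python
import numpy as np

# Toy Monte Carlo pricer for a basket call under d independent
# geometric Brownian motions (hypothetical illustration only).

def basket_call_mc(s0, r, sigma, K, T, M, rng):
    """Discounted Monte Carlo price of a call on the average of d GBM assets."""
    d = s0.shape[0]
    Z = rng.normal(size=(M, d))
    ST = s0 * np.exp((r - 0.5 * sigma ** 2) * T + sigma * np.sqrt(T) * Z)
    payoff = np.maximum(ST.mean(axis=1) - K, 0.0)
    return np.exp(-r * T) * payoff.mean()
```

Setting sigma = 0 reduces the price to the deterministic discounted payoff, a convenient sanity check; correlations and per-asset volatilities are deliberately omitted to keep the sketch short.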

  • overcoming the Curse of Dimensionality in the numerical approximation of semilinear parabolic partial differential equations
    arXiv: Probability, 2018
    Co-Authors: Martin Hutzenthaler, Thomas Kruse, Tuan-anh Nguyen, Arnulf Jentzen, Philippe Von Wurstemberger
    Abstract:

    It has long been known that high-dimensional linear parabolic partial differential equations (PDEs) can be approximated by Monte Carlo methods with a computational effort which grows polynomially both in the dimension and in the reciprocal of the prescribed accuracy. In other words, linear PDEs do not suffer from the Curse of Dimensionality. For general semilinear PDEs with Lipschitz coefficients, however, it remained an open question whether these suffer from the Curse of Dimensionality. In this paper we partially solve this open problem. More precisely, we prove in the case of semilinear heat equations with gradient-independent and globally Lipschitz continuous nonlinearities that the computational effort of a variant of the recently introduced multilevel Picard approximations grows polynomially both in the dimension and in the reciprocal of the required accuracy.