
Approximation Theorem

The experts below are selected from a list of 297 experts worldwide, as ranked by the ideXlab platform.

Guillaume Rond – 1st expert on this subject based on the ideXlab platform

  • Linear nested Artin approximation theorem for algebraic power series
    Manuscripta Mathematica, 2019
    Co-Authors: F. J. Castro-Jiménez, Dorin Popescu, Guillaume Rond

    Abstract:

    We give an elementary proof of the nested Artin approximation theorem for linear equations with algebraic power series coefficients. Moreover, for any Noetherian local subring of the ring of formal power series, we clarify the relationship between this theorem and the problem of the commutation of two operations on ideals: replacing an ideal by its completion, and replacing an ideal by one of its elimination ideals. In particular, we prove that a conjecture of Grothendieck about morphisms of analytic/formal algebras and Artin's question about the linear nested approximation problem are equivalent.
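
    As a rough guide to what is being proved, the linear nested Artin approximation theorem can be stated schematically as follows; the hypotheses are simplified here, and $K\langle x \rangle$ denotes the ring of algebraic power series.

        Consider a linear system
        \[
          \sum_{j=1}^{m} a_{ij}(x)\, y_j = b_i(x), \qquad a_{ij},\, b_i \in K\langle x_1, \dots, x_n \rangle,
        \]
        with the nesting constraint that each unknown $y_j$ may involve only $x_1, \dots, x_{\sigma(j)}$.
        If the system admits a nested formal solution $\hat{y} \in K[[x]]^m$, then for every $c \in \mathbb{N}$
        it admits a nested algebraic solution $y \in K\langle x \rangle^m$ with $y \equiv \hat{y} \bmod (x)^c$.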

  • Linear nested Artin approximation theorem for algebraic power series
    arXiv: Commutative Algebra, 2015
    Co-Authors: F. J. Castro-Jiménez, Dorin Popescu, Guillaume Rond

    Abstract:

    We give a new and elementary proof of the nested Artin approximation theorem for linear equations with algebraic power series coefficients. Moreover, for any Noetherian local subring of the ring of formal power series, we clarify the relationship between this theorem and the problem of the commutation of two operations on ideals: replacing an ideal by its completion, and replacing an ideal by one of its elimination ideals.
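
    The commutation question mentioned in both abstracts can be sketched as follows. This is a simplified reading, assuming the ambient ring is $K[[x, y]]$ and writing $A' = A \cap K[[x]]$; the paper's precise setting may differ.

        Let $A \subseteq K[[x, y]]$ be a Noetherian local subring, $I \subseteq A$ an ideal, and
        $A' = A \cap K[[x]]$. "Eliminate, then complete" produces $(I \cap A')\,\widehat{A'}$, while
        "complete, then eliminate" produces $I\widehat{A} \cap \widehat{A'}$. The question is whether
        \[
          (I \cap A')\,\widehat{A'} \;=\; I\widehat{A} \cap \widehat{A'}
        \]
        holds; the inclusion $\subseteq$ is immediate, and the content lies in the reverse one.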

Mohammad Mursaleen – 2nd expert on this subject based on the ideXlab platform

  • Korovkin Type Approximation Theorem for Almost and Statistical Convergence
    Springer Optimization and Its Applications, 2020
    Co-Authors: Mohammad Mursaleen, Syed Abdul Mohiuddine

    Abstract:

    In this paper, we use the notions of almost convergence and statistical convergence to prove a Korovkin-type approximation theorem via the test functions $1, e^{-x}, e^{-2x}$. We also give an illustrative example in support of our results.
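
    For orientation, the shape of such a result is sketched below; the precise function space and norm vary from paper to paper, so read this as a schematic rather than the authors' exact statement.

        A sequence $(x_n)$ converges statistically to $L$ if, for every $\varepsilon > 0$,
        \[
          \lim_{N \to \infty} \tfrac{1}{N}\, \#\{\, n \le N : |x_n - L| \ge \varepsilon \,\} = 0 .
        \]
        A Korovkin-type theorem then asserts: if $(L_n)$ is a sequence of positive linear operators such
        that $L_n f_i \to f_i$ (in the chosen convergence mode) for the three test functions
        $f_0 = 1$, $f_1 = e^{-x}$, $f_2 = e^{-2x}$, then $L_n f \to f$ in the same mode for every $f$
        in the relevant function space.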

  • Statistical summability (C, 1) and a Korovkin-type approximation theorem
    Journal of Inequalities and Applications, 2012
    Co-Authors: Syed Abdul Mohiuddine, Abdullah Alotaibi, Mohammad Mursaleen

    Abstract:

    The concept of statistical summability $(C, 1)$ was recently introduced by Móricz [J. Math. Anal. Appl. 275, 277–287 (2002)]. In this paper, we use this notion of summability to prove a Korovkin-type approximation theorem via the test functions $1, e^{-x}, e^{-2x}$. We also give the rate of statistical summability $(C, 1)$ and apply the classical Baskakov operator to construct an example in support of our main result. MSC: 41A10; 41A25; 41A36; 40A30; 40G15
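
    Since the abstract invokes the classical Baskakov operator, a small numerical sketch may help. This is not the paper's example; it simply evaluates the standard operator $V_n(f; x) = \sum_{k \ge 0} \binom{n+k-1}{k} x^k (1+x)^{-n-k} f(k/n)$ in Python with a truncated series, to illustrate that $V_n f \to f$ as $n$ grows.

        import math

        def baskakov(f, n, x, kmax=4000):
            # Classical Baskakov operator, series truncated at kmax (an
            # illustrative sketch, fine for moderate x). Weights are computed
            # in log space to avoid overflow in the binomial coefficient
            # C(n+k-1, k) = Gamma(n+k) / (Gamma(k+1) * Gamma(n)).
            if x == 0.0:
                return f(0.0)
            total = 0.0
            for k in range(kmax + 1):
                logw = (math.lgamma(n + k) - math.lgamma(k + 1) - math.lgamma(n)
                        + k * math.log(x) - (n + k) * math.log1p(x))
                total += math.exp(logw) * f(k / n)
            return total

        # V_n(e^{-t}; x) should approach e^{-x} as n grows.
        f = lambda t: math.exp(-t)
        for n in (10, 50, 250):
            err = max(abs(baskakov(f, n, 0.1 * j) - f(0.1 * j)) for j in range(1, 21))
            print(f"n = {n:3d}   max |V_n f - f| on (0, 2]: {err:.2e}")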

  • Statistical A-summability of double sequences and a Korovkin-type approximation theorem
    Bulletin of The Korean Mathematical Society, 2012
    Co-Authors: Cemal Belen, Mohammad Mursaleen, Mustafa Yildirim

    Abstract:

    In this paper, we define the notion of statistical A-summability for double sequences and find its relation with A-statistical convergence. We apply our new method of summability to prove a Korovkin-type approximation theorem for a function of two variables. Furthermore, through an example, it is shown that our theorem is stronger than the classical and statistical cases.
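
    As a schematic of the definition (the four-dimensional matrix machinery is abbreviated here), assuming the standard setup for double sequences:

        Let $A = (a_{jkmn})$ be a nonnegative RH-regular matrix and set
        \[
          y_{jk} = \sum_{m,n} a_{jkmn}\, x_{mn} .
        \]
        The double sequence $(x_{mn})$ is statistically $A$-summable to $L$ when $(y_{jk})$ is
        statistically convergent to $L$, i.e. the set $\{ (j,k) : |y_{jk} - L| \ge \varepsilon \}$
        has double natural density zero for every $\varepsilon > 0$. $A$-statistical convergence,
        by contrast, applies the density condition to $(x_{mn})$ itself through the weights of $A$.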

Geoffrey J. McLachlan – 3rd expert on this subject based on the ideXlab platform

  • A universal approximation theorem for mixture-of-experts models
    arXiv: Machine Learning, 2016
    Co-Authors: Hien D. Nguyen, Luke R. Lloyd-Jones, Geoffrey J. McLachlan

    Abstract:

    The mixture-of-experts (MoE) model is a popular neural network architecture for nonlinear regression and classification. The class of MoE mean functions is known to be uniformly convergent to any unknown target function, assuming that the target function is from a Sobolev space that is sufficiently differentiable and that the domain of estimation is a compact unit hypercube. We provide an alternative result, which shows that the class of MoE mean functions is dense in the class of all continuous functions over arbitrary compact domains of estimation. Our result can be viewed as a universal approximation theorem for MoE models.
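
    In symbols, the density claim in the abstract has the following shape (notation ours, not the paper's):

        For every compact $K \subset \mathbb{R}^d$, every continuous $f : K \to \mathbb{R}$, and every
        $\varepsilon > 0$, there is an MoE mean function
        \[
          m(x) = \sum_{i=1}^{g} \mathrm{Gate}_i(x)\, \mu_i(x), \qquad
          \mathrm{Gate}_i(x) \ge 0, \quad \sum_{i=1}^{g} \mathrm{Gate}_i(x) = 1,
        \]
        with gating functions $\mathrm{Gate}_i$ and expert mean functions $\mu_i$, such that
        $\sup_{x \in K} |f(x) - m(x)| < \varepsilon$.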

  • A Universal Approximation Theorem for Mixture-of-Experts Models
    Neural Computation, 2016
    Co-Authors: Hien D. Nguyen, Luke R. Lloyd-Jones, Geoffrey J. McLachlan

    Abstract:

    The mixture-of-experts (MoE) model is a popular neural network architecture for nonlinear regression and classification. The class of MoE mean functions is known to be uniformly convergent to any unknown target function, assuming that the target function is from a Sobolev space that is sufficiently differentiable and that the domain of estimation is a compact unit hypercube. We provide an alternative result, which shows that the class of MoE mean functions is dense in the class of all continuous functions over arbitrary compact domains of estimation. Our result can be viewed as a universal approximation theorem for MoE models. The theorem we present allows MoE users to be confident in applying such models for estimation when data arise from nonlinear and nondifferentiable generative processes.
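
    To make the nondifferentiable case concrete, here is a minimal Python sketch (our construction, not the paper's): a two-expert MoE with softmax gating and linear experts that approximates f(x) = |x| on [-1, 1], a continuous function with no derivative at 0. The sharpness parameter s is illustrative; larger s gives a smaller error.

        import numpy as np

        def moe_mean(x, gate_w, gate_b, exp_w, exp_b):
            # MoE mean function: m(x) = sum_i softmax_i(gate logits) * mu_i(x).
            logits = gate_w * x + gate_b      # gating scores, one per expert
            g = np.exp(logits - logits.max())
            g /= g.sum()                      # softmax gating weights, sum to 1
            mu = exp_w * x + exp_b            # linear expert means
            return float(g @ mu)

        # Two experts: mu_1(x) = -x, mu_2(x) = +x. A sharp softmax gate routes
        # x < 0 to expert 1 and x > 0 to expert 2, so m(x) is close to |x|.
        s = 50.0                              # illustrative gate sharpness
        gate_w = np.array([-s, s]); gate_b = np.zeros(2)
        exp_w  = np.array([-1.0, 1.0]); exp_b = np.zeros(2)

        xs = np.linspace(-1.0, 1.0, 201)
        err = max(abs(moe_mean(x, gate_w, gate_b, exp_w, exp_b) - abs(x)) for x in xs)
        print(f"max |m(x) - |x|| on [-1, 1]: {err:.3e}")

    Here $m(x) = x \tanh(sx)$, so the error is largest near the kink at 0 and shrinks as $s$ grows, which is the mechanism behind approximating nondifferentiable targets with smooth gates.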