Universal Approximation

14,000,000 Leading Edge Experts on the ideXlab platform

Scan Science and Technology

Contact Leading Edge Experts & Companies

The Experts below are selected from a list of 15,318 Experts worldwide, ranked by the ideXlab platform

Zarita Zainuddin - One of the best experts on this subject based on the ideXlab platform.

  • The Universal Approximation Capabilities of Cylindrical Approximate Identity Neural Networks
    Arabian Journal for Science and Engineering, 2016
    Co-Authors: Zarita Zainuddin, Saeed Panahian Fard
    Abstract:

    Universal Approximation capability of feedforward neural networks is one of the important theoretical concepts in artificial neural networks. In this study, a type of single-hidden-layer feedforward neural network, called the feedforward cylindrical approximate identity neural network, is presented. The Universal Approximation capabilities of these networks are then investigated in two function spaces, whereby the notions of cylindrical approximate identity and cylindrical convolution are introduced. The analysis is divided into two cases. In the first case, the Universal Approximation capability of a single-hidden-layer feedforward cylindrical approximate identity neural network with respect to continuous bivariate functions on the infinite cylinder is investigated. In the second case, the Universal Approximation capability of the networks is extended to pth-order Lebesgue-integrable bivariate functions on the infinite cylinder.
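    The approximate-identity mechanism these papers build on can be illustrated with a small numerical sketch (a hypothetical example, not the authors' cylindrical construction; the Gaussian kernel, grid, and test function are assumptions for illustration). Convolving a continuous function with an increasingly narrow kernel of unit mass reproduces the function with vanishing error:

```python
import numpy as np

# Hypothetical sketch of the approximate-identity mechanism (not the
# authors' cylindrical construction): a family of kernels k_n that
# integrate to 1 and concentrate at 0 satisfies (k_n * f) -> f uniformly
# on compact sets for continuous f.

def gaussian_kernel(x, n):
    # k_n(x) = n/sqrt(2*pi) * exp(-(n*x)**2 / 2): unit mass, width ~ 1/n
    return n / np.sqrt(2.0 * np.pi) * np.exp(-0.5 * (n * x) ** 2)

def convolve_with_identity(f, n, grid):
    # Riemann-sum approximation of (k_n * f)(x) = int k_n(x - t) f(t) dt
    dt = grid[1] - grid[0]
    return np.array([np.sum(gaussian_kernel(x - grid, n) * f(grid)) * dt
                     for x in grid])

grid = np.linspace(-np.pi, np.pi, 801)
f = np.cos
# Sup-error on the interior of the grid (edges trimmed to avoid truncating
# the kernel's tails at the boundary of the integration window).
err = [np.max(np.abs(convolve_with_identity(f, n, grid)[100:-100]
                     - f(grid)[100:-100]))
       for n in (2, 8, 32)]
print(err)  # the error shrinks monotonically as the kernel narrows
```

    Replacing the integral by a finite sum of kernel terms gives, loosely, the single-hidden-layer form whose Universal Approximation capability these papers analyze.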

  • Universal Approximation Property of Weighted Approximate Identity Neural Networks
    Proceedings of The 5th International Conference on Computer Engineering and Networks — PoS(CENet2015), 2015
    Co-Authors: Saeed Fard Panahian, Zarita Zainuddin
    Abstract:

    Universal Approximation property of feedforward artificial neural networks is one of the important research topics in the Approximation theory of artificial neural networks. In this paper, we examine the Universal Approximation property of one-hidden-layer feedforward weighted approximate identity neural networks in the weighted spaces of continuous functions. To this end, we first introduce the notion of a weighted approximate identity. Second, we present two theorems that use this notion to establish the Universal Approximation property of weighted approximate identity neural networks in the weighted spaces of continuous functions. The obtained results extend existing theoretical results.

  • The Universal Approximation capabilities of double 2π-periodic approximate identity neural networks
    Soft Computing, 2015
    Co-Authors: Saeed Panahian Fard, Zarita Zainuddin
    Abstract:

    The purpose of this study is to investigate the Universal Approximation capabilities of a certain class of single-hidden-layer feedforward neural networks, called double 2π-periodic approximate identity neural networks. Using a double 2π-periodic approximate identity, several theorems concerning the Universal Approximation capabilities of the networks are proved. The proofs of these theorems are sketched based on double convolution linear operators and the definition of an ε-net. The obtained results are divided into two categories. First, the Universal Approximation capability of the networks is shown in the space of continuous bivariate 2π-periodic functions. Then, the Universal Approximation capability of the networks is extended to the space of pth-order Lebesgue-integrable bivariate 2π-periodic functions. These results can be interpreted as an extension of the Universal Approximation capabilities established for single-hidden-layer feedforward neural networks.

  • Universal Approximation by Generalized Mellin Approximate Identity Neural Networks
    Proceedings of the 4th International Conference on Computer Engineering and Networks, 2015
    Co-Authors: Saeed Panahian Fard, Zarita Zainuddin
    Abstract:

    This study considers sufficient and necessary conditions for the Universal Approximation capability of three-layer feedforward generalized Mellin approximate identity neural networks. Our approach consists of three steps. In the first step, we introduce the notion of a generalized Mellin approximate identity. In the second step, we use this notion to prove that the convolution linear operators of a generalized Mellin approximate identity with a continuous, compactly supported function f on ℝ⁺ converge uniformly to f. In the third step, we establish a main theorem by combining the previous steps. The theorem shows Universal Approximation by generalized Mellin approximate identity neural networks.

Saeed Panahian Fard - One of the best experts on this subject based on the ideXlab platform.

  • ICNC - Approximation of multivariate 2π-periodic functions by multiple 2π-periodic approximate identity neural networks based on the Universal Approximation theorems
    2015 11th International Conference on Natural Computation (ICNC), 2015
    Co-Authors: Zarita Zainuddin, Saeed Panahian Fard
    Abstract:

    Universal Approximation capability is an important research topic in artificial neural networks. The purpose of this study is to investigate the Universal Approximation capability of single-hidden-layer feedforward multiple 2π-periodic approximate identity neural networks in two function spaces. We present the notion of a multiple 2π-periodic approximate identity. With respect to this notion, we prove two theorems in the space of continuous multivariate 2π-periodic functions. The second theorem shows that the above networks have the Universal Approximation capability; its proof uses a technique based on the notion of an ε-net. Moreover, we discuss the Universal Approximation capability of the networks in the space of Lebesgue-integrable multivariate 2π-periodic functions. The results of this study extend the standard theory of the Universal Approximation capability of feedforward neural networks.

  • Solving Universal Approximation Problem by a Class of Neural Networks based on Hankel Approximate Identity in Function Spaces
    2015
    Co-Authors: Saeed Panahian Fard, Zarita Zainuddin
    Abstract:

    Artificial neural networks have been applied effectively to numerous applications because of their Universal Approximation property. This work is grounded in two frameworks. First, it is concerned with solving the Universal Approximation problem for a class of neural networks based on a Hankel approximate identity in the space of continuous functions on the positive real numbers. Second, the problem is investigated in the Lebesgue spaces on the positive real numbers. The methods are constructed from the notions of Hankel convolution linear operators, the Hankel approximate identity, and the ε-net.

Vladik Kreinovich - One of the best experts on this subject based on the ideXlab platform.

  • Universal Approximation with uninorm-based fuzzy neural networks
    2011 Annual Meeting of the North American Fuzzy Information Processing Society, 2011
    Co-Authors: André Paim Lemos, Vladik Kreinovich, Walmir M. Caminhas, Fernando Gomide
    Abstract:

    Fuzzy neural networks are hybrid models capable of approximating functions with high precision and of generating transparent models, enabling the extraction of valuable information from the resulting topology. In this paper we show that the recently proposed fuzzy neural network based on weighted uninorm aggregations uniformly approximates any real function on any compact set. We describe the network topology and inference mechanism and show that the Universal Approximation property of this network holds for a given choice of operators.
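    The uninorm machinery can be made concrete with a small sketch (an illustrative example, not the paper's specific weighted-uninorm aggregation; the underlying t-norm, t-conorm, and mixed-region choice are assumptions). A uninorm has an identity element e in (0, 1) and behaves like a t-norm below e and like a t-conorm above it:

```python
# Minimal uninorm sketch (illustrative; not the paper's weighted
# aggregation). A uninorm generalises t-norms and t-conorms by letting
# the identity element e sit anywhere in (0, 1).

def uninorm(x, y, e=0.5):
    if x <= e and y <= e:
        # t-norm region: rescaled product t-norm
        return e * (x / e) * (y / e)
    if x >= e and y >= e:
        # t-conorm region: rescaled probabilistic sum
        a, b = (x - e) / (1 - e), (y - e) / (1 - e)
        return e + (1 - e) * (a + b - a * b)
    # mixed region: the min-based completion (one admissible choice)
    return min(x, y)

# e acts as the identity: U(x, e) == x for every x in [0, 1]
print([round(uninorm(x, 0.5), 3) for x in (0.1, 0.5, 0.9)])  # [0.1, 0.5, 0.9]
```

    Any such operator reduces to a t-norm when e = 1 and to a t-conorm when e = 0, which is why uninorm-based networks strictly generalise the t-norm/t-conorm models.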

  • Universal Approximation theorem for uninorm based fuzzy systems modeling
    Fuzzy Sets and Systems, 2003
    Co-Authors: Ronald R. Yager, Vladik Kreinovich
    Abstract:

    Most existing Universal Approximation results for fuzzy systems are based on the assumption that we use t-norms and t-conorms to represent “and” and “or.” Yager has proposed to use, within the fuzzy systems modeling paradigm, more general operations based on uninorms. In this paper, we show that the Universal Approximation property holds for an arbitrary choice of uninorm.

Changchun Zhu - One of the best experts on this subject based on the ideXlab platform.

  • wavelet support vector machine with Universal Approximation and its application
    Information Theory Workshop, 2006
    Co-Authors: Wenhui Chen, Wanzhao Cui, Changchun Zhu
    Abstract:

    Wavelet support vector machines (WSVMs) using the Mexican hat wavelet kernel have been used successfully for nonlinear system identification, but their Universal Approximation property had never been proved in theory. Based on the Stone-Weierstrass theorem, it is proved that the WSVM can approximate arbitrary functions on a compact set with arbitrary accuracy. Simulations show that the WSVM is very effective in nonlinear system identification and can reduce the noise of the system, so the WSVM has great potential applications in function estimation, nonlinear system identification, signal processing, and control.
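    The kernel in question can be exercised with a minimal sketch (the kernel width, ridge term, and target function are assumptions for illustration, not the paper's experiments). The Mexican hat wavelet ψ(u) = (1 − u²)exp(−u²/2) induces a translation-invariant kernel, and a kernel ridge fit with it drives the residual on a smooth target down, consistent with the Universal Approximation property:

```python
import numpy as np

# Illustrative sketch (width a, ridge term, and target are assumptions):
# kernel ridge regression with the Mexican hat wavelet kernel
# K(x, z) = psi((x - z) / a), psi(u) = (1 - u**2) * exp(-u**2 / 2).

def mexican_hat_kernel(x, z, a=0.5):
    u = (x[:, None] - z[None, :]) / a
    return (1.0 - u ** 2) * np.exp(-0.5 * u ** 2)

x_train = np.linspace(0.0, 2.0 * np.pi, 60)
y_train = np.sin(x_train)

# A small ridge term keeps the positive-semidefinite Gram matrix invertible.
K = mexican_hat_kernel(x_train, x_train)
coef = np.linalg.solve(K + 1e-6 * np.eye(len(x_train)), y_train)

x_test = np.linspace(0.0, 2.0 * np.pi, 201)
y_pred = mexican_hat_kernel(x_test, x_train) @ coef
print(np.max(np.abs(y_pred - np.sin(x_test))))  # small sup-error
```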

Barbara Hammer - One of the best experts on this subject based on the ideXlab platform.

  • Universal Approximation Capability of Cascade Correlation for Structures
    Neural Computation, 2005
    Co-Authors: Barbara Hammer, Alessio Micheli, Alessandro Sperduti
    Abstract:

    Cascade correlation (CC) constitutes a training method for neural networks that determines the weights as well as the neural architecture during training. Various extensions of CC to structured data have been proposed: recurrent cascade correlation (RCC) for sequences, recursive cascade correlation (RecCC) for tree structures with limited fan-out, and contextual recursive cascade correlation (CRecCC) for rooted directed positional acyclic graphs (DPAGs) with limited fan-in and fan-out. We show that these models possess the Universal Approximation property in the following sense: given a probability measure P on the input set, every measurable function from sequences into a real vector space can be approximated by a sigmoidal RCC up to any desired degree of accuracy, up to inputs of arbitrarily small probability. Every measurable function from tree structures with limited fan-out into a real vector space can be approximated by a sigmoidal RecCC with multiplicative neurons up to any desired degree of accuracy, up to inputs of arbitrarily small probability. For sigmoidal CRecCC networks with multiplicative neurons, we show the Universal Approximation capability for functions on an important subset of all DPAGs with limited fan-in and fan-out for which a specific linear representation yields unique codes. We give one sufficient structural condition for the latter property, which can easily be tested: the enumerations of ingoing and outgoing edges should be compatible. This property can be fulfilled for every DPAG with fan-in and fan-out two via re-enumeration of children and parents, and for larger fan-in and fan-out via an expansion of the fan-in and fan-out and re-enumeration of children and parents. In addition, the result can be generalized to the case of input-output isomorphic transductions of structures. Thus, CRecCC networks constitute the first neural models for which the Universal Approximation capability of functions involving fairly general acyclic graph structures is proved.

  • A note on the Universal Approximation capability of support vector machines
    Neural Processing Letters, 2003
    Co-Authors: Barbara Hammer, Kai Gersmann
    Abstract:

    The Approximation capability of support vector machines (SVMs) is investigated. We show the Universal Approximation capability of SVMs with various kernels, including Gaussian, several dot-product, and polynomial kernels, based on the Universal Approximation capability of their standard feedforward neural network counterparts. Moreover, it is shown that an SVM with a polynomial kernel of degree p − 1 that is trained on a training set of size p can approximate the p training points to any accuracy.
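    The final claim can be checked numerically with a toy sketch (made-up inputs and targets, not the paper's proof): for p distinct 1-D inputs, the Gram matrix of a degree-(p − 1) polynomial kernel is invertible, so a kernel expansion can hit all p training targets exactly:

```python
import numpy as np

# Toy check of the interpolation claim: with the polynomial kernel
# K(x, z) = (1 + x*z)**(p - 1), the Gram matrix of p distinct inputs is
# invertible, so the kernel machine reproduces all p targets exactly.

p = 5
x = np.linspace(-1.0, 1.0, p)               # p distinct training inputs
y = np.array([0.3, -1.2, 0.5, 2.0, -0.7])   # arbitrary targets

K = (1.0 + np.outer(x, x)) ** (p - 1)       # polynomial kernel, degree p - 1
alpha = np.linalg.solve(K, y)               # exact interpolation coefficients

y_hat = K @ alpha
print(np.max(np.abs(y_hat - y)))  # ~0: all p points reproduced exactly
```

    Invertibility follows because the kernel's feature map spans the polynomials of degree up to p − 1, giving a weighted Vandermonde structure on distinct points.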