Orthogonal Projection Operator

The Experts below are selected from a list of 1728 Experts worldwide, ranked by the ideXlab platform.

Kenneth Butts - One of the best experts on this subject based on the ideXlab platform.

  • Multiscale Support Vector Learning With Projection Operator Wavelet Kernel for Nonlinear Dynamical System Identification
    IEEE transactions on neural networks and learning systems, 2016
    Co-Authors: Jing Sun, Kenneth Butts
    Abstract:

    Kernel-based learning has, over the past two decades, become a mainstay for designing effective nonlinear computational learning algorithms. In view of the geometric interpretation of conditional expectation and the ubiquity of multiscale characteristics in highly complex nonlinear dynamic systems [1]–[3], this paper presents a new Orthogonal Projection Operator wavelet kernel, aiming at an efficient computational learning approach for nonlinear dynamical system identification. In the framework of multiresolution analysis, the proposed Projection Operator wavelet kernel supports multiscale, multidimensional learning for estimating complex dependencies. Its special advantage lies in its closed-form expression, which greatly facilitates its application in kernel learning. To the best of our knowledge, it is the first closed-form Orthogonal Projection wavelet kernel reported in the literature. It provides a link between grid-based wavelets and mesh-free kernel-based methods. Simulation studies identifying parallel models of two benchmark nonlinear dynamical systems confirm its superiority in model accuracy and sparsity.
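
    The specific closed-form kernel is not reproduced in this abstract. For orientation, the simplest closed-form orthogonal projection kernel arising from multiresolution analysis is the Shannon one: the orthogonal projection of L²(ℝ) onto the scale-j approximation space V_j has reproducing kernel K_j(x, z) = 2^j sinc(2^j (x − z)), and a product across coordinates extends it to multidimensional inputs. The sketch below is a minimal stand-in, assuming this Shannon kernel and kernel ridge regression in place of the paper's support vector learner; the toy NARX system and all parameter values are hypothetical.

```python
import numpy as np

def shannon_projection_kernel(X, Z, j=2):
    """Gram matrix of the Shannon MRA projection kernel at scale j:
    K_j(x, z) = prod_d 2^j * sinc(2^j * (x_d - z_d)),
    with numpy's normalized sinc(t) = sin(pi t) / (pi t).
    This is the reproducing kernel of the orthogonal projection of L^2
    onto the Shannon approximation space V_j, hence positive semidefinite."""
    D = X[:, None, :] - Z[None, :, :]            # pairwise coordinate differences
    return np.prod(2.0**j * np.sinc(2.0**j * D), axis=-1)

rng = np.random.default_rng(0)

# Toy NARX-style data (hypothetical system, for illustration only):
# y[k] depends nonlinearly on y[k-1] and the input u[k-1].
u = rng.uniform(-1.0, 1.0, 300)
y = np.zeros(300)
for k in range(1, 300):
    y[k] = 0.5 * y[k-1] / (1.0 + y[k-1]**2) + np.tanh(u[k-1])

X = np.column_stack([y[1:-1], u[1:-1]])          # regressors [y[k-1], u[k-1]]
t = y[2:]                                        # targets y[k]

# Kernel ridge regression as a simple stand-in for support vector learning.
K = shannon_projection_kernel(X, X)
alpha = np.linalg.solve(K + 1e-3 * np.eye(len(K)), t)
pred = K @ alpha
print("training RMSE:", np.sqrt(np.mean((pred - t)**2)))
```

    Because the kernel is the kernel of an orthogonal projection, its Gram matrix is automatically positive semidefinite, which is exactly the property kernel learning requires.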

Wen Yan - One of the best experts on this subject based on the ideXlab platform.

  • IJCNN - Closed-form Projection Operator wavelet kernels in support vector learning for nonlinear dynamical systems identification
    The 2013 International Joint Conference on Neural Networks (IJCNN), 2013
    Co-Authors: Wen Yan
    Abstract:

    As a special idempotent Operator, the Projection Operator plays a crucial role in the Spectral Decomposition Theorem for linear Operators in Hilbert space. In this paper, an innovative Orthogonal Projection Operator wavelet kernel is developed for support vector learning. In the framework of multi-resolution analysis, the proposed wavelet kernel supports multi-scale, multidimensional learning for estimating complex dependencies. The particular advantage of the wavelet kernel developed in this paper lies in its closed-form expression, which greatly facilitates its application in kernel learning. To the best of our knowledge, it is the first closed-form Orthogonal Projection wavelet kernel in the literature. In the scenario of linear programming support vector learning, the proposed closed-form Projection Operator wavelet kernel is used to identify a parallel model of a benchmark nonlinear dynamical system. A simulation study confirms its superiority in model accuracy and sparsity.
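
    On the linear-programming side, sparsity comes from replacing the quadratic SVM objective with an ℓ1 penalty on the kernel-expansion coefficients. Below is a hedged sketch of one standard LP support vector regression program (minimize ‖α‖₁ + C·Σξ subject to ε-insensitive constraints), solved with scipy.optimize.linprog; this is a generic formulation for illustration, not necessarily the exact program used in the paper, and the Gaussian kernel and toy data are stand-ins.

```python
import numpy as np
from scipy.optimize import linprog

def lp_svr(K, y, C=10.0, eps=0.05):
    """LP support vector regression: minimize ||alpha||_1 + C*sum(xi)
    subject to |y_i - (K alpha)_i - b| <= eps + xi_i.
    Decision variables are stacked as [alpha+, alpha-, b+, b-, xi], all >= 0."""
    n = len(y)
    c = np.concatenate([np.ones(2 * n), [0.0, 0.0], C * np.ones(n)])
    I = np.eye(n); one = np.ones((n, 1))
    A_up = np.hstack([-K,  K, -one,  one, -I])   #  y - f(x) <= eps + xi
    A_lo = np.hstack([ K, -K,  one, -one, -I])   #  f(x) - y <= eps + xi
    res = linprog(c,
                  A_ub=np.vstack([A_up, A_lo]),
                  b_ub=np.concatenate([eps - y, eps + y]),
                  bounds=(0, None), method="highs")
    z = res.x
    return z[:n] - z[n:2*n], z[2*n] - z[2*n + 1]   # alpha, b

# Toy usage with a Gaussian kernel; most alpha_i come out exactly zero.
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(-3, 3, 60))
y = np.sin(x) + 0.05 * rng.standard_normal(60)
K = np.exp(-(x[:, None] - x[None, :])**2)
alpha, b = lp_svr(K, y)
print("nonzero coefficients:", np.sum(np.abs(alpha) > 1e-6), "of", len(alpha))
```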

Jing Sun - One of the best experts on this subject based on the ideXlab platform.

  • Multiscale Support Vector Learning With Projection Operator Wavelet Kernel for Nonlinear Dynamical System Identification
    IEEE transactions on neural networks and learning systems, 2016
    Co-Authors: Jing Sun, Kenneth Butts
    Abstract:

    Kernel-based learning has, over the past two decades, become a mainstay for designing effective nonlinear computational learning algorithms. In view of the geometric interpretation of conditional expectation and the ubiquity of multiscale characteristics in highly complex nonlinear dynamic systems [1]–[3], this paper presents a new Orthogonal Projection Operator wavelet kernel, aiming at an efficient computational learning approach for nonlinear dynamical system identification. In the framework of multiresolution analysis, the proposed Projection Operator wavelet kernel supports multiscale, multidimensional learning for estimating complex dependencies. Its special advantage lies in its closed-form expression, which greatly facilitates its application in kernel learning. To the best of our knowledge, it is the first closed-form Orthogonal Projection wavelet kernel reported in the literature. It provides a link between grid-based wavelets and mesh-free kernel-based methods. Simulation studies identifying parallel models of two benchmark nonlinear dynamical systems confirm its superiority in model accuracy and sparsity.

Suman Majumdar - One of the best experts on this subject based on the ideXlab platform.

  • On the conditional distribution of a multivariate Normal given a transformation – the linear case
    Heliyon, 2019
    Co-Authors: Rajeshwari Majumdar, Suman Majumdar
    Abstract:

    We show that the Orthogonal Projection Operator onto the range of the adjoint T* of a linear Operator T can be represented as UT, where U is an invertible linear Operator. Given a Normal random vector Y and a linear Operator T, we use this representation to obtain a linear Operator T̂ such that T̂Y is independent of TY and Y − T̂Y is an affine function of TY. We then use this decomposition to prove that the conditional distribution of a Normal random vector Y given TY, where T is a linear transformation, is again a multivariate Normal distribution. This result is equivalent to the well-known result that given a k-dimensional component of an n-dimensional Normal random vector, where k < n, the conditional distribution of the remaining (n − k)-dimensional component is an (n − k)-dimensional multivariate Normal distribution, and it sets the stage for approximating the conditional distribution of Y given g(Y), where g is a continuously differentiable vector field.
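
    The abstract does not spell out T̂. Under the standard Gaussian conditioning formulas, one concrete choice that satisfies both stated properties (an illustrative reconstruction, not necessarily the paper's UT-based construction) is T̂ = I − ΣTᵀ(TΣTᵀ)⁻¹T for a zero-mean Y with covariance Σ and full-row-rank T: the cross-covariance T̂ΣTᵀ vanishes, so T̂Y and TY are independent by joint normality, and Y − T̂Y = ΣTᵀ(TΣTᵀ)⁻¹TY is linear in TY. A small numpy check:

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 5, 2

# Zero-mean Y with SPD covariance Sigma, and a full-row-rank map T (illustrative).
A = rng.standard_normal((n, n)); Sigma = A @ A.T + n * np.eye(n)
T = rng.standard_normal((k, n))

# Hypothetical choice (an assumption, not the paper's UT construction):
# That = I - Sigma T' (T Sigma T')^{-1} T.
G = Sigma @ T.T @ np.linalg.inv(T @ Sigma @ T.T)   # gain mapping TY back to Y
That = np.eye(n) - G @ T

# Cov(That@Y, T@Y) = That Sigma T' vanishes, so by joint normality That@Y is
# independent of T@Y, while Y - That@Y = G @ (T @ Y) is exactly linear in TY.
print(np.allclose(That @ Sigma @ T.T, 0.0))        # True

# Hence Y | TY = t is Normal with mean G @ t and covariance That @ Sigma @ That.T.
```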

  • On the Conditional Distribution of a Multivariate Normal given a Transformation - the Linear Case
    arXiv: Statistics Theory, 2017
    Co-Authors: Rajeshwari Majumdar, Suman Majumdar
    Abstract:

    We show that the Orthogonal Projection Operator onto the range of the adjoint of a linear Operator $T$ can be represented as $UT$, where $U$ is an invertible linear Operator. Using this representation we obtain a decomposition of a Normal random vector $Y$ as the sum of a linear transformation of $Y$ that is independent of $TY$ and an affine transformation of $TY$. We then use this decomposition to prove that the conditional distribution of a Normal random vector $Y$ given a linear transformation $TY$ is again a multivariate Normal distribution. This result is equivalent to the well-known result that given a $k$-dimensional component of an $n$-dimensional Normal random vector, where $k < n$, the conditional distribution of the remaining $(n-k)$-dimensional component is an $(n-k)$-dimensional multivariate Normal distribution, and it sets the stage for approximating the conditional distribution of $Y$ given $g(Y)$, where $g$ is a continuously differentiable vector field.

  • On the regular conditional distribution of a multivariate Normal given a linear transformation
    arXiv: Statistics Theory, 2016
    Co-Authors: Rajeshwari Majumdar, Suman Majumdar
    Abstract:

    We show that the Orthogonal Projection Operator onto the range of the adjoint of a linear Operator T can be represented as UT, where U is an invertible linear Operator. Using this representation we obtain a decomposition of a multivariate Normal random variable Y as the sum of a linear transformation of Y that is independent of TY and an affine transformation of TY. We then use this decomposition to prove that the regular conditional distribution of a multivariate Normal random variable Y given a linear transformation TY is again a multivariate Normal distribution. This result is equivalent to the well-known result that given a k-dimensional component of an n-dimensional multivariate Normal random variable, where k < n, the regular conditional distribution of the remaining (n − k)-dimensional component is an (n − k)-dimensional multivariate Normal distribution.
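
    For reference, the well-known special case these abstracts invoke is the classical partitioned-Gaussian formula, stated here in LaTeX:

```latex
% Conditional distribution for a partitioned Gaussian vector.
% Y_2 is the observed k-dimensional component, k < n.
\[
\begin{pmatrix} Y_1 \\ Y_2 \end{pmatrix}
\sim \mathcal{N}\!\left(
\begin{pmatrix} \mu_1 \\ \mu_2 \end{pmatrix},
\begin{pmatrix} \Sigma_{11} & \Sigma_{12} \\ \Sigma_{21} & \Sigma_{22} \end{pmatrix}
\right)
\;\Longrightarrow\;
Y_1 \mid Y_2 = y_2 \sim
\mathcal{N}\!\left(
\mu_1 + \Sigma_{12}\Sigma_{22}^{-1}(y_2 - \mu_2),\;
\Sigma_{11} - \Sigma_{12}\Sigma_{22}^{-1}\Sigma_{21}
\right).
\]
```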

Barry D. Van Veen - One of the best experts on this subject based on the ideXlab platform.

  • Beamforming and the Gram-Schmidt Preprocessor
    1994
    Co-Authors: Rajesh Sharma, Barry D. Van Veen
    Abstract:

    An alternate derivation of the modular structure for linearly constrained minimum variance beamforming proposed in (1) is presented using a vector space approach. This approach eliminates the tedious algebra employed in (1) and establishes the relationship between the modular structure and the Gram-Schmidt preprocessor (3). The modular structure is obtained using a factorization of the Orthogonal Projection Operator in Hilbert space. The Gram-Schmidt preprocessor is a special case of the general modular decomposition. It is also shown that these structures offer computational efficiencies when multiple beamformers are implemented simultaneously.
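
    The Gram-Schmidt connection can be made concrete with the generalized sidelobe canceller form of LCMV: a projection onto the constraint subspace fixes a quiescent weight, and the adaptive stage lives in the orthogonal complement, spanned by a blocking matrix that Gram-Schmidt (realized here by a full QR) produces. The numpy sketch below checks that this modular form reproduces the direct LCMV solution; the array size, covariance, and constraints are illustrative assumptions, not the paper's beamformer.

```python
import numpy as np

rng = np.random.default_rng(3)
N, L = 8, 2                        # sensors, number of linear constraints

# Illustrative inputs: SPD array covariance R, constraints C' w = f.
A = rng.standard_normal((N, N)); R = A @ A.T + N * np.eye(N)
C = rng.standard_normal((N, L)); f = rng.standard_normal(L)

# Direct LCMV beamformer: w = R^{-1} C (C' R^{-1} C)^{-1} f.
RiC = np.linalg.solve(R, C)
w_lcmv = RiC @ np.linalg.solve(C.T @ RiC, f)

# Modular (GSC) form built from the orthogonal projection onto range(C):
# w_q is the minimum-norm weight meeting the constraints; the blocking
# matrix B spans range(C)-perp (a Gram-Schmidt step, done here via QR).
w_q = C @ np.linalg.solve(C.T @ C, f)
Q, _ = np.linalg.qr(C, mode="complete")
B = Q[:, L:]                       # columns orthogonal to range(C): B.T @ C == 0

# Unconstrained adaptive stage, confined to the blocked subspace.
w_a = np.linalg.solve(B.T @ R @ B, B.T @ R @ w_q)
w_gsc = w_q - B @ w_a

print(np.allclose(w_gsc, w_lcmv))  # True: the two structures coincide
```

    Because every weight vector satisfying the constraints decomposes as w_q minus a vector in the blocked subspace, minimizing output power over the adaptive stage alone recovers the full LCMV optimum, which is why the modular structures can share computation across multiple beamformers.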

  • Large modular structures for adaptive beamforming and the Gram-Schmidt preprocessor
    IEEE Transactions on Signal Processing, 1994
    Co-Authors: Rajesh Sharma, Barry D. Van Veen
    Abstract:

    An alternate derivation of the modular structure for linearly constrained minimum variance beamforming proposed in Liu and Van Veen (1991) is presented using a vector space approach. This approach eliminates the tedious algebra employed in that paper and establishes the relationship between the modular structure and the Gram-Schmidt preprocessor (Mozingo and Miller, 1980). The modular structure is obtained using a factorization of the Orthogonal Projection Operator in Hilbert space. The Gram-Schmidt preprocessor is a special case of the general modular decomposition. It is also shown that these structures offer computational efficiencies when multiple beamformers are implemented simultaneously.