Grassmann

The experts below are selected from a list of 15,168 experts worldwide, ranked by the ideXlab platform.

V. A. Soroka - One of the best experts on this subject based on the ideXlab platform.

  • Supersymmetry and the Odd Poisson Bracket
    arXiv: High Energy Physics - Theory, 2002
    Co-Authors: V. A. Soroka
    Abstract:

    Some applications of the odd Poisson bracket developed by Kharkov's theorists are presented, including the reformulation of classical Hamiltonian dynamics, the description of hydrodynamics as a Hamiltonian system by means of the odd bracket, and the formulation of dynamics with a Grassmann-odd Lagrangian. Quantum representations of the odd bracket are also constructed and applied to the quantization of classical systems based on the odd bracket and to the realization of the idea of a composite spinor structure of space-time. Finally, the linear odd bracket corresponding to a semi-simple Lie group is introduced on the Grassmann algebra.
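    For orientation, in one common convention the canonical odd Poisson bracket on a phase superspace with commuting coordinates $x^a$ and Grassmann-odd conjugates $\theta_a$ reads as follows (a standard textbook form; the paper's sign and derivative conventions may differ):

    $$\{A,B\}_1=\frac{\partial_r A}{\partial x^a}\,\frac{\partial_l B}{\partial\theta_a}+\frac{\partial_r A}{\partial\theta_a}\,\frac{\partial_l B}{\partial x^a},\qquad \varepsilon\bigl(\{A,B\}_1\bigr)=\varepsilon(A)+\varepsilon(B)+1\ (\mathrm{mod}\ 2),$$

    where $\partial_r$ and $\partial_l$ denote right and left derivatives. Its single second-order Grassmann-odd nilpotent operator is $\Delta=\partial^2/\partial x^a\partial\theta_a$, the operator contrasted with the three $\Delta$-like operators of the linear odd bracket in the entries below.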

  • Degenerate Odd Poisson Bracket on Grassmann Variables
    Physics of Atomic Nuclei, 2000
    Co-Authors: V. A. Soroka
    Abstract:

    A linear degenerate odd Poisson bracket (antibracket) realized solely on Grassmann variables is proposed. It is revealed that this bracket simultaneously possesses three Grassmann-odd nilpotent Δ-like differential operators of the first, second, and third orders in the Grassmann derivatives. It is shown that these Δ-like operators, together with the Grassmann-odd nilpotent Casimir function of this bracket, form a finite-dimensional Lie superalgebra.

  • Linear Odd Poisson Bracket on Grassmann Variables
    Physics Letters B, 1999
    Co-Authors: V. A. Soroka
    Abstract:

    A linear odd Poisson bracket (antibracket) realized solely in terms of Grassmann variables is suggested. It is revealed that the bracket, which corresponds to a semi-simple Lie group, simultaneously possesses three Grassmann-odd nilpotent Δ-like differential operators of the first, second, and third orders in the Grassmann derivatives, in contrast with the canonical odd Poisson bracket, which has only a single Grassmann-odd nilpotent differential Δ-operator, of the second order. It is shown that these Δ-like operators, together with a Grassmann-odd nilpotent Casimir function of this bracket, form a finite-dimensional Lie superalgebra.
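    Schematically, and only as an illustration of the kind of construction the abstract describes (our notation; the paper's conventions and normalizations may differ), such a linear odd bracket can be generated on the Grassmann variables $\theta_a$ by the structure constants $c_{ab}{}^{c}$ of a Lie algebra:

    $$\{\theta_a,\theta_b\}_1=c_{ab}{}^{c}\,\theta_c,\qquad \Delta_{(2)}=\tfrac{1}{2}\,c_{ab}{}^{c}\,\theta_c\,\frac{\partial^2}{\partial\theta_b\,\partial\theta_a},$$

    with the graded Jacobi identity of the bracket and the nilpotency $\Delta_{(2)}^2=0$ both following from the Jacobi identity for $c_{ab}{}^{c}$; the first- and third-order $\Delta$-like operators mentioned in the abstract are additional structures specific to this bracket.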

  • Linear Odd Poisson Bracket on Grassmann Variables
    arXiv: High Energy Physics - Theory, 1998
    Co-Authors: V. A. Soroka
    Abstract:

    A linear odd Poisson bracket (antibracket) realized solely in terms of Grassmann variables is suggested. It is revealed that the bracket, which corresponds to a semi-simple Lie group, simultaneously possesses three Grassmann-odd nilpotent $\Delta$-like differential operators of the first, second, and third orders in the Grassmann derivatives, in contrast with the canonical odd Poisson bracket, which has only a single Grassmann-odd nilpotent differential $\Delta$-operator, of the second order. It is shown that these $\Delta$-like operators, together with a Grassmann-odd nilpotent Casimir function of this bracket, form a finite-dimensional Lie superalgebra.

  • Degenerate Odd Poisson Bracket on Grassmann Variables
    arXiv: High Energy Physics - Theory, 1998
    Co-Authors: V. A. Soroka
    Abstract:

    A linear degenerate odd Poisson bracket (antibracket) realized solely on Grassmann variables is presented. It is revealed that this bracket simultaneously possesses three nilpotent $\Delta$-like differential operators of the first, second, and third orders in the Grassmann derivatives. It is shown that these $\Delta$-like operators, together with the Grassmann-odd nilpotent Casimir function of this bracket, form a finite-dimensional Lie superalgebra.

Berkant Savas - One of the best experts on this subject based on the ideXlab platform.

  • A NEWTON–Grassmann METHOD FOR COMPUTING THE BEST MULTILINEAR RANK-(r1, r2, r3) APPROXIMATION OF A TENSOR
    2013
    Co-Authors: Lars Elden, Berkant Savas
    Abstract:

    We derive a Newton method for computing the best rank-(r1,r2,r3) approximation of a given J × K × L tensor A. The problem is formulated as an approximation problem on a product of Grassmann manifolds. Incorporating the manifold structure into Newton’s method ensures that all iterates generated by the algorithm are points on the Grassmann manifolds. We also introduce a consistent notation for matricizing a tensor, for contracted tensor products, and for some tensor-algebraic manipulations, which simplifies the derivation of the Newton equations and enables straightforward algorithmic implementation. Experiments show a quadratic convergence rate for the Newton–Grassmann algorithm.
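    Concretely, in generic notation (not necessarily the authors'), the best rank-(r1, r2, r3) problem and its Grassmann formulation can be written as

    $$\min_{\operatorname{rank}(\mathcal B)\le(r_1,r_2,r_3)}\|\mathcal A-\mathcal B\|_F^2\;\Longleftrightarrow\;\max_{U^\top U=I,\;V^\top V=I,\;W^\top W=I}\bigl\|\mathcal A\times_1 U^\top\times_2 V^\top\times_3 W^\top\bigr\|_F^2,$$

    with $U\in\mathbb R^{J\times r_1}$, $V\in\mathbb R^{K\times r_2}$, $W\in\mathbb R^{L\times r_3}$. Since the objective is invariant under $U\mapsto UQ_1$, $V\mapsto VQ_2$, $W\mapsto WQ_3$ for orthogonal $Q_i$, only the column spaces of the factors matter, and the natural search space is the product of Grassmann manifolds $\operatorname{Gr}(J,r_1)\times\operatorname{Gr}(K,r_2)\times\operatorname{Gr}(L,r_3)$.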

  • A NEWTON-Grassmann METHOD FOR COMPUTING THE BEST MULTILINEAR RANK-(R_1, R_2, R_3) APPROXIMATION OF A TENSOR
    2010
    Co-Authors: Lars Elden, Berkant Savas
    Abstract:

    We derive a Newton method for computing the best rank-(r_1, r_2, r_3) approximation of a given J × K × L tensor A. The problem is formulated as an approximation problem on a product of Grassmann manifolds. Incorporating the manifold structure into Newton’s method ensures that all iterates generated by the algorithm are points on the Grassmann manifolds. We also introduce a consistent notation for matricizing a tensor, for contracted tensor products, and for some tensor-algebraic manipulations, which simplifies the derivation of the Newton equations and enables straightforward algorithmic implementation. Experiments show a quadratic convergence rate for the Newton-Grassmann algorithm.

  • A NEWTON-Grassmann METHOD FOR COMPUTING THE BEST MULTILINEAR RANK-(R1, R2, R3) APPROXIMATION OF A TENSOR
    2010
    Co-Authors: Lars Elden, Berkant Savas
    Abstract:

    We derive a Newton method for computing the best rank-(r1, r2, r3) approximation of a given J × K × L tensor A. The problem is formulated as an approximation problem on a product of Grassmann manifolds. Incorporating the manifold structure into Newton’s method ensures that all iterates generated by the algorithm are points on the Grassmann manifolds. We also introduce a consistent notation for matricizing a tensor, for contracted tensor products, and for some tensor-algebraic manipulations, which simplifies the derivation of the Newton equations and enables straightforward algorithmic implementation. Experiments show a quadratic convergence rate for the Newton-Grassmann algorithm.

  • A Newton–Grassmann Method for Computing the Best Multilinear Rank-(r_1, r_2, r_3) Approximation of a Tensor
    SIAM Journal on Matrix Analysis and Applications, 2009
    Co-Authors: Lars Elden, Berkant Savas
    Abstract:

    We derive a Newton method for computing the best rank-$(r_1,r_2,r_3)$ approximation of a given $J\times K\times L$ tensor $\mathcal{A}$. The problem is formulated as an approximation problem on a product of Grassmann manifolds. Incorporating the manifold structure into Newton's method ensures that all iterates generated by the algorithm are points on the Grassmann manifolds. We also introduce a consistent notation for matricizing a tensor, for contracted tensor products, and for some tensor-algebraic manipulations, which simplifies the derivation of the Newton equations and enables straightforward algorithmic implementation. Experiments show a quadratic convergence rate for the Newton-Grassmann algorithm.
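    As a point of reference only — this is the common alternating baseline (higher-order orthogonal iteration, HOOI) for the same rank-$(r_1,r_2,r_3)$ problem, not the Newton–Grassmann algorithm of the paper — a minimal NumPy sketch; function names are illustrative:

    import numpy as np

    def unfold(T, mode):
        # Mode-n matricization: move the chosen mode to the front and flatten the rest.
        return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

    def multiply_mode(T, M, mode):
        # Multiply tensor T along the given mode by the matrix M (shape new_dim x old_dim).
        return np.moveaxis(np.tensordot(M, np.moveaxis(T, mode, 0), axes=1), 0, mode)

    def hooi(A, ranks, iters=50):
        # Alternating (HOOI) baseline for the best rank-(r1, r2, r3) approximation of a
        # 3-way tensor A; the Newton-Grassmann method solves the same problem on the
        # product of Grassmann manifolds with quadratic local convergence.
        factors = [np.linalg.svd(unfold(A, n), full_matrices=False)[0][:, :r]
                   for n, r in enumerate(ranks)]            # truncated-HOSVD initialization
        for _ in range(iters):
            for n in range(3):
                B = A
                for m in range(3):
                    if m != n:
                        B = multiply_mode(B, factors[m].T, m)    # project the other modes
                # Update the mode-n subspace from the dominant left singular vectors.
                factors[n] = np.linalg.svd(unfold(B, n),
                                           full_matrices=False)[0][:, :ranks[n]]
        core = A
        for m in range(3):
            core = multiply_mode(core, factors[m].T, m)      # core tensor S = A x_n U_n^T
        return factors, core

    For example, hooi(np.random.rand(20, 16, 12), (4, 3, 2)) returns 20×4, 16×3, and 12×2 orthonormal factors together with a 4×3×2 core; each factor enters the objective only through its column space, which is why the search space is a product of Grassmann manifolds.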

Kaibin Huang - One of the best experts on this subject based on the ideXlab platform.

  • Automatic Recognition of Space-Time Constellations by Learning on the Grassmann Manifold
    Global Communications Conference, 2018
    Co-Authors: Guangxu Zhu, Jiayao Zhang, Kaibin Huang
    Abstract:

    Recent breakthroughs in machine learning, especially artificial intelligence, are shifting the paradigm of wireless communication towards intelligent radios. One of their core operations is automatic modulation recognition (AMR). Existing research focuses on coherent modulation schemes such as QAM, PSK, and FSK. The AMR of (non-coherent) space-time modulation remains an uncharted area despite its wide deployment in modern multiple-input-multiple-output (MIMO) systems. The scheme using a so-called Grassmann constellation (comprising unitary matrices) enables rate enhancement using multiple antennas and blind detection. In this work, we propose an AMR approach for Grassmann constellations based on data clustering, which differs from traditional AMR based on classification against a modulation database. The approach allows algorithms for clustering on the Grassmann manifold (or the Grassmannian), such as Grassmann K-means, originally developed for computer vision, to be applied to AMR. In this paper, maximum-likelihood (ML) Grassmann constellation detection is proved to be equivalent to clustering on the Grassmannian. Thereby, a well-known machine-learning result that was originally established only for the Euclidean space is rediscovered for the Grassmannian.

  • Automatic Recognition of Space-Time Constellations by Learning on the Grassmann Manifold
    2018
    Co-Authors: Guangxu Zhu, Jiayao Zhang, Kaibin Huang
    Abstract:

    Recent breakthroughs in machine learning, especially artificial intelligence, are shifting the paradigm of wireless communication towards intelligent radios. One of their core operations is automatic modulation recognition (AMR). Existing research focuses on coherent modulation schemes such as QAM, PSK, and FSK. The AMR of (non-coherent) space-time modulation remains an uncharted area despite its wide deployment in modern multiple-input-multiple-output (MIMO) systems. The scheme using a so-called Grassmann constellation enables rate enhancement using multiple antennas and blind detection. In this work, we propose an AMR approach for Grassmann constellations based on data clustering, which differs from traditional AMR based on classification against a modulation database. The approach allows algorithms for clustering on the Grassmann manifold, such as Grassmann K-means and depth-first search, originally developed for computer vision, to be applied to AMR. We further develop an analytical framework for studying and designing these algorithms in the context of AMR. First, maximum-likelihood Grassmann constellation detection is proved to be equivalent to clustering on the Grassmannian. Thereby, a well-known machine-learning result that was originally established only for the Euclidean space is rediscovered for the Grassmannian. Next, despite a rich literature on algorithmic design, theoretical analysis of data clustering has been largely overlooked owing to the lack of tractable techniques. We tackle this challenge by introducing probabilistic metrics for measuring the inter-cluster separability and intra-cluster connectivity of received space-time symbols and by deriving them using tools from differential geometry and Grassmannian packing. The results provide useful insights into the effects of various parameters, ranging from the signal-to-noise ratio to the constellation size, facilitating algorithmic design.

  • Automatic Recognition of Space-Time Constellations by Learning on the Grassmann Manifold
    IEEE Transactions on Signal Processing, 2018
    Co-Authors: Guangxu Zhu, Jiayao Zhang, Kaibin Huang
    Abstract:

    Recent breakthroughs in machine learning are shifting the paradigm of wireless communication towards intelligent radios. One of their core operations is automatic modulation recognition (AMR). Existing research focuses on coherent modulation schemes such as QAM and FSK. The AMR of (noncoherent) space-time modulation remains an uncharted area despite its deployment in modern multiple-input-multiple-output (MIMO) systems. The scheme using a so-called Grassmann constellation enables rate enhancement. In this paper, we propose an AMR approach for Grassmann constellations based on data clustering, which differs from traditional AMR based on classification against a modulation database. The approach allows algorithms for clustering on the Grassmann manifold (or the Grassmannian), such as Grassmann K-means and depth-first search, to be applied to AMR. We further develop an analytical framework for studying and designing these algorithms in the context of AMR. First, the expectation-maximization algorithm for Grassmann constellation detection is proved to be equivalent to clustering (K-means) on the Grassmannian at high SNR. Thereby, a well-known machine-learning result that was originally established only for the Euclidean space is rediscovered for the Grassmannian. Next, we tackle the challenge of theoretical analysis of data clustering by introducing probabilistic metrics for measuring the inter-cluster separability and intra-cluster connectivity of received space-time symbols and by deriving them using tools from differential geometry. The results provide useful insights into the effects of various parameters, ranging from the signal-to-noise ratio to the constellation size, facilitating algorithmic design.
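    A minimal sketch of K-means on the Grassmannian in the spirit of the clustering step described above, using the chordal (projection-Frobenius) distance between the column spaces of received space-time symbols and a centroid computed from the averaged projection matrix; the interface and names are illustrative assumptions, not the authors' implementation:

    import numpy as np

    def chordal_dist2(X, Y):
        # Squared chordal distance between span(X) and span(Y); X, Y have
        # orthonormal columns (T x M blocks, e.g. received space-time symbols).
        return X.shape[1] - np.linalg.norm(X.conj().T @ Y, 'fro') ** 2

    def chordal_centroid(points):
        # Centroid under the chordal metric: dominant eigenvectors of the
        # averaged projection matrix of the cluster members.
        M = points[0].shape[1]
        P = sum(X @ X.conj().T for X in points) / len(points)
        w, V = np.linalg.eigh(P)
        return V[:, np.argsort(w)[::-1][:M]]

    def grassmann_kmeans(points, K, iters=30, seed=0):
        # Lloyd-style K-means on the Grassmannian: assign every subspace-valued
        # sample to its nearest centroid, then recompute the centroids.
        rng = np.random.default_rng(seed)
        centroids = [points[i] for i in rng.choice(len(points), K, replace=False)]
        labels = np.zeros(len(points), dtype=int)
        for _ in range(iters):
            labels = np.array([np.argmin([chordal_dist2(X, C) for C in centroids])
                               for X in points])
            centroids = [chordal_centroid([p for p, l in zip(points, labels) if l == k])
                         if np.any(labels == k) else centroids[k]
                         for k in range(K)]
        return labels, centroids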

Gangyao Kuang - One of the best experts on this subject based on the ideXlab platform.

  • Classification via Sparse Representation of Steerable Wavelet Frames on Grassmann Manifold: Application to Target Recognition in SAR Image
    IEEE Transactions on Image Processing, 2017
    Co-Authors: Ganggang Dong, Gangyao Kuang, Na Wang, Wei Wang
    Abstract:

    Automatic target recognition has been widely studied over the years, yet it remains an open problem. The main obstacle lies in extended operating conditions, e.g., depression angle change, configuration variation, articulation, and occlusion. To deal with these, this paper proposes a new classification strategy. We develop a new representation model via steerable wavelet frames. The proposed representation model is viewed in its entirety as an element of a Grassmann manifold. To achieve target classification, we embed the Grassmann manifold into an implicit reproducing kernel Hilbert space (RKHS), where kernel sparse learning can be applied. Specifically, the mappings of the training samples in the RKHS are concatenated to form an overcomplete dictionary, which is then used to encode the counterpart of the query as a linear combination of its atoms. With the designed Grassmann kernel function, the sparse representation can be obtained, from which the inference is reached. The novelty of this paper comes from: 1) the development of a representation model from the set of directional components of the Riesz transform; 2) the quantitative measurement of similarity between the proposed representation models via a Grassmann metric; and 3) the generation of a global kernel function from the Grassmann kernel. Extensive comparative studies are performed to demonstrate the advantage of the proposed strategy.
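    One widely used Grassmann kernel that makes such an RKHS embedding concrete is the projection kernel $k(X,Y)=\|X^\top Y\|_F^2$ between orthonormal bases; the sketch below (an illustrative baseline, not necessarily the kernel designed in the paper) builds the Gram matrices that kernel sparse coding over the dictionary would consume:

    import numpy as np

    def projection_kernel(X, Y):
        # Projection (Grassmann) kernel between subspaces with orthonormal bases X, Y.
        return np.linalg.norm(X.T @ Y, 'fro') ** 2

    def grassmann_gram_matrices(dictionary, queries):
        # Gram matrices for kernel sparse coding on the Grassmannian:
        # K_DD compares the dictionary atoms (training subspaces) with each other,
        # K_DQ compares them with the query subspaces.
        K_DD = np.array([[projection_kernel(A, B) for B in dictionary] for A in dictionary])
        K_DQ = np.array([[projection_kernel(A, Q) for Q in queries] for A in dictionary])
        return K_DD, K_DQ

    In the usual kernel sparse representation scheme, the code $\alpha$ of a query $q$ is found by minimizing $k(q,q)-2\alpha^\top K_{DQ}[:,j]+\alpha^\top K_{DD}\alpha$ plus a sparsity penalty, and the query is assigned to the class whose atoms reconstruct it with the smallest residual, matching the assignment rule the abstract describes.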

  • SAR Target Recognition via Sparse Representation of Monogenic Signal on Grassmann Manifolds
    IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2016
    Co-Authors: Ganggang Dong, Gangyao Kuang
    Abstract:

    In this paper, classification via sparse representation of the monogenic signal on Grassmann manifolds is presented for target recognition in SAR images. To capture the broad spectral information of a SAR image with maximal spatial localization, a recently proposed vector-valued analytic signal, namely the monogenic signal, is exploited. Different from conventional methods, in which a single feature descriptor is generated from the monogenic signal in a Euclidean space, the multiple components of the monogenic signal at various scale spaces are viewed as points on a special type of Riemannian manifold, the Grassmann manifold. The similarity between a pair of patterns (points) is measured by a Grassmann distance metric. To further exploit the nonlinear geometric structure, we embed the sets of monogenic components into an implicit reproducing kernel Hilbert space (RKHS), where kernel-based sparse signal modeling can be learned to reach the inference. Specifically, the sets of monogenic components resulting from the training samples are first concatenated to build a redundant dictionary. Then, the counterpart of the query is efficiently approximated by a superposition of atoms of the dictionary. Notably, the representation coefficients of the superposition are very parsimonious. The inference is drawn by evaluating which class of training patterns can recover the query most accurately. The novelty of this paper comes from: 1) the development of Grassmann manifolds formed by the multiresolution monogenic signal; 2) the definition of similarity between sets of monogenic components on Grassmann manifolds for target recognition; 3) the generalization of sparse signal modeling to the Grassmann manifold; and 4) multiple comparative experiments for performance assessment.
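    For reference, Grassmann distance metrics of the kind used here are built from the principal angles between the two subspaces spanned by the sets of monogenic components (standard definitions; the paper may adopt any of the equivalent metrics derived from them):

    $$\cos\theta_i=\sigma_i\!\left(X^\top Y\right),\qquad d_{\mathrm{geo}}(X,Y)=\Bigl(\sum_i\theta_i^2\Bigr)^{1/2},\qquad d_{\mathrm{chord}}(X,Y)=\Bigl(\sum_i\sin^2\theta_i\Bigr)^{1/2},$$

    where $X$ and $Y$ are orthonormal bases of the two subspaces and $\sigma_i(\cdot)$ denotes the $i$-th singular value.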

Ganggang Dong - One of the best experts on this subject based on the ideXlab platform.

  • Classification via Sparse Representation of Steerable Wavelet Frames on Grassmann Manifold: Application to Target Recognition in SAR Image
    IEEE Transactions on Image Processing, 2017
    Co-Authors: Ganggang Dong, Gangyao Kuang, Na Wang, Wei Wang
    Abstract:

    Automatic target recognition has been widely studied over the years, yet it remains an open problem. The main obstacle lies in extended operating conditions, e.g., depression angle change, configuration variation, articulation, and occlusion. To deal with these, this paper proposes a new classification strategy. We develop a new representation model via steerable wavelet frames. The proposed representation model is viewed in its entirety as an element of a Grassmann manifold. To achieve target classification, we embed the Grassmann manifold into an implicit reproducing kernel Hilbert space (RKHS), where kernel sparse learning can be applied. Specifically, the mappings of the training samples in the RKHS are concatenated to form an overcomplete dictionary, which is then used to encode the counterpart of the query as a linear combination of its atoms. With the designed Grassmann kernel function, the sparse representation can be obtained, from which the inference is reached. The novelty of this paper comes from: 1) the development of a representation model from the set of directional components of the Riesz transform; 2) the quantitative measurement of similarity between the proposed representation models via a Grassmann metric; and 3) the generation of a global kernel function from the Grassmann kernel. Extensive comparative studies are performed to demonstrate the advantage of the proposed strategy.

  • SAR Target Recognition via Sparse Representation of Monogenic Signal on Grassmann Manifolds
    IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2016
    Co-Authors: Ganggang Dong, Gangyao Kuang
    Abstract:

    In this paper, classification via sparse representation of the monogenic signal on Grassmann manifolds is presented for target recognition in SAR images. To capture the broad spectral information of a SAR image with maximal spatial localization, a recently proposed vector-valued analytic signal, namely the monogenic signal, is exploited. Different from conventional methods, in which a single feature descriptor is generated from the monogenic signal in a Euclidean space, the multiple components of the monogenic signal at various scale spaces are viewed as points on a special type of Riemannian manifold, the Grassmann manifold. The similarity between a pair of patterns (points) is measured by a Grassmann distance metric. To further exploit the nonlinear geometric structure, we embed the sets of monogenic components into an implicit reproducing kernel Hilbert space (RKHS), where kernel-based sparse signal modeling can be learned to reach the inference. Specifically, the sets of monogenic components resulting from the training samples are first concatenated to build a redundant dictionary. Then, the counterpart of the query is efficiently approximated by a superposition of atoms of the dictionary. Notably, the representation coefficients of the superposition are very parsimonious. The inference is drawn by evaluating which class of training patterns can recover the query most accurately. The novelty of this paper comes from: 1) the development of Grassmann manifolds formed by the multiresolution monogenic signal; 2) the definition of similarity between sets of monogenic components on Grassmann manifolds for target recognition; 3) the generalization of sparse signal modeling to the Grassmann manifold; and 4) multiple comparative experiments for performance assessment.