Geometric Mean

The Experts below are selected from a list of 309 Experts worldwide ranked by ideXlab platform

Yuanming Shi - One of the best experts on this subject based on the ideXlab platform.

  • Geometric Mean of partial positive definite matrices with missing entries
    Linear and Multilinear Algebra, 2019
    Co-Authors: Hayoung Choi, Sejong Kim, Yuanming Shi
    Abstract:

    In this paper the Geometric Mean of partial positive definite matrices with missing entries is considered. The weighted Geometric Mean of two sets of positive matrices is defined, and we examine whether such a Geometric Mean satisfies the properties that the weighted Geometric Mean of two positive definite matrices does.

  • Geometric Mean of Partial Positive Definite Matrices with Missing Entries
    arXiv: Functional Analysis, 2018
    Co-Authors: Hayoung Choi, Sejong Kim, Yuanming Shi
    Abstract:

    In this paper the Geometric Mean of partial positive definite matrices with missing entries is considered. The weighted Geometric Mean of two sets of positive matrices is defined, and we examine whether such a Geometric Mean satisfies the properties that the weighted Geometric Mean of two positive definite matrices does. Counterexamples demonstrate that certain properties do not hold. A Loewner order on partial Hermitian matrices is also defined. The known results on the maximum determinant positive completion are developed into an integral representation, and these results are applied to the weighted Geometric Mean of two partial positive definite matrices with missing entries. Moreover, a relationship between two positive definite completions is established with respect to their determinants, which relates their entropies for a zero-mean multivariate Gaussian distribution. Computational results and one application are presented.
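
    As a point of reference for the two-matrix case this abstract builds on, the weighted Geometric Mean of positive definite matrices A and B is commonly written A #_t B = A^{1/2} (A^{-1/2} B A^{-1/2})^t A^{1/2}, with A # B the case t = 1/2. The sketch below only illustrates that standard formula (it is not the authors' code; the function names are ours) and checks the Riccati identity G A^{-1} G = B at t = 1/2.

```python
# Illustrative sketch (not the authors' code): the weighted Geometric Mean
# A #_t B = A^{1/2} (A^{-1/2} B A^{-1/2})^t A^{1/2} of two symmetric
# positive definite matrices, computed via eigendecompositions.
import numpy as np

def spd_power(M, t):
    """M**t for a symmetric positive definite matrix M."""
    w, V = np.linalg.eigh(M)
    return (V * w**t) @ V.T

def weighted_geometric_mean(A, B, t=0.5):
    """A #_t B for symmetric positive definite A, B and a weight t in [0, 1]."""
    Ah, Anh = spd_power(A, 0.5), spd_power(A, -0.5)
    return Ah @ spd_power(Anh @ B @ Anh, t) @ Ah

rng = np.random.default_rng(0)
X, Y = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))
A, B = X @ X.T + np.eye(3), Y @ Y.T + np.eye(3)
G = weighted_geometric_mean(A, B, t=0.5)
# At t = 1/2 the mean solves the Riccati equation G A^{-1} G = B.
print(np.allclose(G @ np.linalg.inv(A) @ G, B))
```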

Xiaojing Zhang - One of the best experts on this subject based on the ideXlab platform.

Sejong Kim - One of the best experts on this subject based on the ideXlab platform.

  • Geometric Mean of partial positive definite matrices with missing entries
    Linear and Multilinear Algebra, 2019
    Co-Authors: Hayoung Choi, Sejong Kim, Yuanming Shi
    Abstract:

    In this paper the Geometric Mean of partial positive definite matrices with missing entries is considered. The weighted Geometric Mean of two sets of positive matrices is defined, and we examine whether such a Geometric Mean satisfies the properties that the weighted Geometric Mean of two positive definite matrices does.

  • Geometric Mean block matrices
    Linear Algebra and its Applications, 2019
    Co-Authors: Sejong Kim, Hosoo Lee, Yongdo Lim
    Abstract:

    We consider an m × m block matrix G with entries A_i # A_j, where A_1, …, A_m are positive definite matrices of fixed size and A # B denotes the Geometric Mean of positive definite matrices A and B. We show that G is positive semidefinite if and only if the family A_1, …, A_m is Γ-commuting, that is, it can be transformed into a commuting family of positive definite matrices by a congruence transformation. This result on Γ-commuting families provides not only a class of positive semidefinite block matrices but also a new extremal characterization of the two-variable Geometric Mean in terms of multivariate block matrices.
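
    As an illustration of the statement above (not the authors' code), the sketch below assembles G = [A_i # A_j] for a commuting family of positive definite matrices, which is trivially Γ-commuting, and checks numerically that G is positive semidefinite.

```python
# Illustrative sketch: build the block matrix G = [A_i # A_j] for a
# commuting family (here, powers of one positive definite matrix) and
# check that G is positive semidefinite, as the theorem predicts.
import numpy as np

def spd_power(M, t):
    """M**t for a symmetric positive definite matrix M."""
    w, V = np.linalg.eigh(M)
    return (V * w**t) @ V.T

def geometric_mean(A, B):
    """A # B = A^{1/2} (A^{-1/2} B A^{-1/2})^{1/2} A^{1/2}."""
    Ah, Anh = spd_power(A, 0.5), spd_power(A, -0.5)
    return Ah @ spd_power(Anh @ B @ Anh, 0.5) @ Ah

# Powers of a single positive definite matrix form a commuting family.
rng = np.random.default_rng(1)
S = rng.standard_normal((3, 3))
P = S @ S.T + np.eye(3)
family = [spd_power(P, k) for k in (0.5, 1.0, 2.0)]

m = len(family)
G = np.block([[geometric_mean(family[i], family[j]) for j in range(m)]
              for i in range(m)])
print(np.linalg.eigvalsh(G).min() >= -1e-9)   # True: G is positive semidefinite
```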

  • Geometric Mean of Partial Positive Definite Matrices with Missing Entries
    arXiv: Functional Analysis, 2018
    Co-Authors: Hayoung Choi, Sejong Kim, Yuanming Shi
    Abstract:

    In this paper the Geometric Mean of partial positive definite matrices with missing entries is considered. The weighted Geometric Mean of two sets of positive matrices is defined, and we examine whether such a Geometric Mean satisfies the properties that the weighted Geometric Mean of two positive definite matrices does. Counterexamples demonstrate that certain properties do not hold. A Loewner order on partial Hermitian matrices is also defined. The known results on the maximum determinant positive completion are developed into an integral representation, and these results are applied to the weighted Geometric Mean of two partial positive definite matrices with missing entries. Moreover, a relationship between two positive definite completions is established with respect to their determinants, which relates their entropies for a zero-mean multivariate Gaussian distribution. Computational results and one application are presented.
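
    The maximum determinant positive definite completion mentioned in this abstract can be posed as a log-det maximization. The sketch below is a minimal illustration that assumes the cvxpy modelling package and a small hypothetical 3 × 3 pattern with one missing symmetric pair of entries; it is not the authors' implementation.

```python
# Illustrative sketch only (assumes the cvxpy package; not the authors'
# implementation): maximum determinant positive definite completion of a
# partial matrix with one missing symmetric pair of entries, posed as a
# log-det maximization.
import cvxpy as cp
import numpy as np

known = np.array([[2.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 2.0]])
specified = np.array([[1, 1, 0],
                      [1, 1, 1],
                      [0, 1, 1]], dtype=bool)   # True = entry is given

X = cp.Variable((3, 3), PSD=True)
constraints = [X[i, j] == known[i, j]
               for i in range(3) for j in range(3) if specified[i, j]]
cp.Problem(cp.Maximize(cp.log_det(X)), constraints).solve()
print(np.round(X.value, 4))
# For this banded (chordal) pattern the completion is known in closed form:
# the missing entry equals known[0, 1] * known[1, 2] / known[1, 1] = 1/3.
```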

  • Relative operator entropy related with the spectral Geometric Mean
    Analysis and Mathematical Physics, 2015
    Co-Authors: Sejong Kim, Hosoo Lee
    Abstract:

    We consider the relative operator entropy constructed from the spectral Geometric Mean and study its properties, which are analogous to those of the Tsallis relative operator entropy constructed from the usual Geometric Mean. Furthermore, we define the quantum relative entropy constructed from the spectral Geometric Mean and derive its subadditivity under the tensor product.
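
    For orientation, the two ingredients named above are usually written as the relative operator entropy S(A|B) = A^{1/2} log(A^{-1/2} B A^{-1/2}) A^{1/2} and the spectral (Fiedler–Pták) Geometric Mean (A^{-1} # B)^{1/2} A (A^{-1} # B)^{1/2}. The sketch below computes only these standard building blocks; the paper's entropy construction itself may differ in detail.

```python
# Sketch of the two ingredients named in the abstract (not the paper's
# construction): the relative operator entropy S(A|B) and the spectral
# (Fiedler-Ptak) Geometric Mean, for symmetric positive definite matrices.
import numpy as np

def spd_func(M, f):
    """Apply f to the eigenvalues of a symmetric positive definite M."""
    w, V = np.linalg.eigh(M)
    return (V * f(w)) @ V.T

def geometric_mean(A, B):
    """A # B = A^{1/2} (A^{-1/2} B A^{-1/2})^{1/2} A^{1/2}."""
    Ah = spd_func(A, np.sqrt)
    Anh = spd_func(A, lambda w: 1 / np.sqrt(w))
    return Ah @ spd_func(Anh @ B @ Anh, np.sqrt) @ Ah

def relative_operator_entropy(A, B):
    """S(A|B) = A^{1/2} log(A^{-1/2} B A^{-1/2}) A^{1/2}."""
    Ah = spd_func(A, np.sqrt)
    Anh = spd_func(A, lambda w: 1 / np.sqrt(w))
    return Ah @ spd_func(Anh @ B @ Anh, np.log) @ Ah

def spectral_geometric_mean(A, B):
    """Commonly defined as (A^{-1} # B)^{1/2} A (A^{-1} # B)^{1/2}."""
    C = spd_func(geometric_mean(np.linalg.inv(A), B), np.sqrt)
    return C @ A @ C

A = np.array([[2.0, 1.0], [1.0, 2.0]])
print(np.allclose(relative_operator_entropy(A, A), 0))   # S(A|A) = 0
print(np.allclose(spectral_geometric_mean(A, A), A))     # the mean of A with itself is A
```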

Maher Moakher - One of the best experts on this subject based on the ideXlab platform.

  • Approximate Joint Diagonalization and Geometric Mean of Symmetric Positive Definite Matrices
    PLoS ONE, 2015
    Co-Authors: Marco Congedo, Bijan Afsari, Alexandre Barachant, Maher Moakher
    Abstract:

    We explore the connection between two problems that have arisen independently in signal processing and related fields: the estimation of the Geometric Mean of a set of symmetric positive definite (SPD) matrices and their approximate joint diagonalization (AJD). Today there is considerable interest in estimating the Geometric Mean of an SPD matrix set in the manifold of SPD matrices endowed with the Fisher information metric. The resulting Mean has several important invariance properties and has proven very useful in diverse engineering applications such as biomedical and image data processing. While for two SPD matrices the Mean has an algebraic closed-form solution, for a set of more than two SPD matrices it can only be estimated by iterative algorithms. However, none of the existing iterative algorithms features, at the same time, fast convergence, low computational complexity per iteration, and a guarantee of convergence. For this reason, other definitions of the Geometric Mean based on symmetric divergence measures, such as the Bhattacharyya divergence, have recently been considered. The resulting Means, although possibly useful in practice, do not satisfy all desirable invariance properties. In this paper we consider Geometric Means of covariance matrices estimated on high-dimensional time series, assuming that the data are generated according to an instantaneous mixing model, which is very common in signal processing. We show that in these circumstances we can approximate the Fisher information Geometric Mean by employing an efficient AJD algorithm. Our approximation is in general much closer to the Fisher information Geometric Mean than its competitors and verifies many invariance properties. Furthermore, convergence is guaranteed, the computational complexity is low, and the convergence rate is quadratic. The accuracy of this new Geometric Mean approximation is demonstrated by means of simulations.
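
    For context, the Fisher-information Geometric Mean of an SPD matrix set is typically computed by a fixed-point (Karcher-mean) iteration on the SPD manifold. The sketch below shows such a generic iteration and its congruence invariance; it is an illustration only, not the AJD-based approximation proposed in the paper.

```python
# Generic fixed-point (Karcher-mean) iteration for the Fisher-information
# Geometric Mean of SPD matrices -- for illustration only; this is NOT the
# AJD-based approximation proposed in the paper.
import numpy as np

def sym_func(M, f):
    """Spectral function: apply f to the eigenvalues of a symmetric M."""
    w, V = np.linalg.eigh(M)
    return (V * f(w)) @ V.T

def karcher_mean(mats, iters=100, step=0.5):
    X = sum(mats) / len(mats)                       # arithmetic mean as a start
    for _ in range(iters):
        Xh = sym_func(X, np.sqrt)
        Xnh = sym_func(X, lambda w: 1 / np.sqrt(w))
        # Riemannian gradient direction: average of the log-maps at X.
        T = sum(sym_func(Xnh @ C @ Xnh, np.log) for C in mats) / len(mats)
        X = Xh @ sym_func(T, lambda w: np.exp(step * w)) @ Xh
        if np.linalg.norm(T) < 1e-12:
            break
    return X

rng = np.random.default_rng(2)
mats = [Z @ Z.T + np.eye(3) for Z in rng.standard_normal((4, 3, 3))]
G = karcher_mean(mats)
# Congruence invariance: the mean of {W C W^T} is W G W^T for invertible W.
W = rng.standard_normal((3, 3)) + 3 * np.eye(3)
print(np.allclose(W @ G @ W.T,
                  karcher_mean([W @ C @ W.T for C in mats]), atol=1e-6))
```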

Yongdo Lim - One of the best experts on this subject based on the ideXlab platform.

  • Geometric Mean block matrices
    Linear Algebra and its Applications, 2019
    Co-Authors: Sejong Kim, Hosoo Lee, Yongdo Lim
    Abstract:

    We consider an m × m block matrix G with entries A_i # A_j, where A_1, …, A_m are positive definite matrices of fixed size and A # B denotes the Geometric Mean of positive definite matrices A and B. We show that G is positive semidefinite if and only if the family A_1, …, A_m is Γ-commuting, that is, it can be transformed into a commuting family of positive definite matrices by a congruence transformation. This result on Γ-commuting families provides not only a class of positive semidefinite block matrices but also a new extremal characterization of the two-variable Geometric Mean in terms of multivariate block matrices.

  • A Geometric Mean of Parameterized Arithmetic and Harmonic Means of Convex Functions
    Abstract and Applied Analysis, 2012
    Co-Authors: Sangho Kum, Yongdo Lim
    Abstract:

    The notion of the Geometric Mean of two positive reals was extended by Ando (1978) to the case of two positive semidefinite matrices A and B. Moreover, an interesting generalization of the Geometric Mean of A and B to convex functions was introduced by Atteia and Raissouli (2001) from a different viewpoint of convex analysis. The present work aims at providing a further development of the Geometric Mean of convex functions due to Atteia and Raissouli (2001). A new algorithmic self-dual operator for convex functions, named “the Geometric Mean of parameterized arithmetic and harmonic Means of convex functions”, is proposed, and its essential properties are investigated.
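
    At the scalar and matrix level, the arithmetic and harmonic means underlying this construction already recover the Geometric Mean: the alternating arithmetic-harmonic iteration converges to A # B. The sketch below illustrates only that classical analogue; it is not the convex-function construction of the paper, and the function name is ours.

```python
# Illustration (not the paper's convex-function construction): the
# alternating arithmetic-harmonic iteration converges to the Geometric
# Mean, shown for positive reals (as 1x1 matrices) and for SPD matrices.
import numpy as np

def arithmetic_harmonic_mean(A, B, iters=40):
    inv = np.linalg.inv
    for _ in range(iters):
        A, B = (A + B) / 2, 2 * inv(inv(A) + inv(B))
    return A

# Scalars: the limit is sqrt(a * b).
a, b = np.array([[4.0]]), np.array([[9.0]])
print(arithmetic_harmonic_mean(a, b))          # -> [[6.]]

# Matrices: the limit is A # B, which solves the Riccati equation X A^{-1} X = B.
rng = np.random.default_rng(3)
X, Y = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))
A, B = X @ X.T + np.eye(3), Y @ Y.T + np.eye(3)
G = arithmetic_harmonic_mean(A, B)
print(np.allclose(G @ np.linalg.inv(A) @ G, B))
```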

  • Weighted Geometric Mean of n-operators with n-parameters
    Linear Algebra and its Applications, 2010
    Co-Authors: Chang-do Jung, Hosoo Lee, Yongdo Lim, Takeaki Yamazaki
    Abstract:

    We consider a weighted Geometric Mean of n-operators with n-parameters. It is based on the Geometric Mean defined in our previous paper [C. Jung, H. Lee, T. Yamazaki, On a new construction of Geometric Mean of n-operators, Linear Algebra Appl. 431 (2009) 1477–1488]. We then show that its weights can be obtained in simple forms in the case of commuting operators. Some properties of the weighted Geometric Mean are obtained.
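
    In the commutative case referred to above, the weighted Geometric Mean of positive definite matrices reduces to the weighted product exp(Σ_i w_i log A_i) = A_1^{w_1} ⋯ A_n^{w_n}. The sketch below illustrates only this special case, not the construction of Jung, Lee, and Yamazaki.

```python
# Sketch of the commutative special case only (not the Jung-Lee-Yamazaki
# construction): for commuting positive definite A_1, ..., A_n and weights
# w_i >= 0 summing to 1, the weighted Geometric Mean reduces to the
# weighted product exp(sum_i w_i log A_i) = A_1^{w_1} ... A_n^{w_n}.
import numpy as np

def spd_func(M, f):
    """Apply f to the eigenvalues of a symmetric positive definite M."""
    w, V = np.linalg.eigh(M)
    return (V * f(w)) @ V.T

def commuting_weighted_geometric_mean(mats, weights):
    L = sum(w * spd_func(A, np.log) for A, w in zip(mats, weights))
    return spd_func(L, np.exp)

# A commuting family: powers of one positive definite matrix.
rng = np.random.default_rng(4)
Z = rng.standard_normal((3, 3))
P = Z @ Z.T + np.eye(3)
mats = [spd_func(P, lambda x: x**k) for k in (1.0, 2.0, 3.0)]
weights = [0.5, 0.3, 0.2]
G = commuting_weighted_geometric_mean(mats, weights)
# Here the result is simply P^(0.5*1 + 0.3*2 + 0.2*3) = P^1.7.
print(np.allclose(G, spd_func(P, lambda x: x**1.7)))
```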

  • The Geometric Mean, Matrices, Metrics, and More
    The American Mathematical Monthly, 2001
    Co-Authors: Jimmie D. Lawson, Yongdo Lim
    Abstract:

    The American Mathematical Monthly, Vol. 108, No. 9 (2001), pp. 797–812.