Orthogonal Similarity

14,000,000 Leading Edge Experts on the ideXlab platform

Scan Science and Technology

Contact Leading Edge Experts & Companies

The Experts below are selected from a list of 273 Experts worldwide ranked by ideXlab platform

Nicola Mastronardi - One of the best experts on this subject based on the ideXlab platform.

  • On Computing Eigenvectors of Symmetric Tridiagonal Matrices
    Structured Matrices in Numerical Linear Algebra, 2019
    Co-Authors: Nicola Mastronardi, Harold Taeter, Paul Van Dooren
    Abstract:

    The computation of the eigenvalue decomposition of symmetric matrices is one of the most investigated problems in numerical linear algebra. For a matrix of moderate size, the customary procedure is to reduce it to a symmetric tridiagonal one by means of an Orthogonal Similarity transformation and then compute the eigendecomposition of the tridiagonal matrix.
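    This two-stage procedure can be sketched with NumPy/SciPy (an illustrative sketch, not the authors' implementation; for a symmetric matrix, SciPy's Hessenberg reduction yields the symmetric tridiagonal form):

    ```python
    import numpy as np
    from scipy.linalg import hessenberg, eigh_tridiagonal

    rng = np.random.default_rng(0)
    A = rng.standard_normal((6, 6))
    A = (A + A.T) / 2                      # symmetric test matrix

    # Step 1: orthogonal similarity reduction.  For a symmetric matrix the
    # Hessenberg form is symmetric tridiagonal: A = Q @ T @ Q.T.
    T, Q = hessenberg(A, calc_q=True)
    assert np.allclose(np.triu(T, 2), 0)   # tridiagonal structure

    # Step 2: eigendecomposition of the tridiagonal matrix, then map the
    # eigenvectors back with Q.
    w, V = eigh_tridiagonal(np.diag(T), np.diag(T, 1))
    X = Q @ V                              # eigenvectors of the original A
    assert np.allclose(A @ X, X * w)       # A X = X diag(w)
    ```

    The eigenvalues of T and of A coincide because an orthogonal similarity transformation preserves the spectrum.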

  • Computing the Jordan Structure of an Eigenvalue
    SIAM Journal on Matrix Analysis and Applications, 2017
    Co-Authors: Nicola Mastronardi, Paul Van Dooren
    Abstract:

    In this paper we revisit the problem of finding an Orthogonal Similarity transformation that puts an $n\times n$ matrix $A$ in a block upper-triangular form that reveals its Jordan structure at a p...

  • The Antitriangular Factorization of Symmetric Matrices
    SIAM Journal on Matrix Analysis and Applications, 2013
    Co-Authors: Nicola Mastronardi, Paul Van Dooren
    Abstract:

    Indefinite symmetric matrices occur in many applications, such as optimization, least squares problems, partial differential equations, and variational problems. In these applications one is often interested in computing a factorization of the indefinite matrix that reveals the inertia of the matrix or provides an estimate of its eigenvalues. In this paper we propose an algorithm that provides this information for any symmetric indefinite matrix by transforming it to a block antitriangular form using Orthogonal Similarity transformations. We also show that the algorithm is backward stable and has a complexity comparable to that of existing matrix decompositions for dense indefinite matrices.
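    The inertia triple that such a factorization exposes can be illustrated with a small NumPy sketch. Note this computes the inertia directly from the spectrum purely for checking purposes; it is not the authors' antitriangular algorithm, whose point is to recover the same information without a full eigendecomposition:

    ```python
    import numpy as np

    def inertia(A, tol=1e-9):
        """Inertia (n_minus, n_zero, n_plus) of a symmetric matrix,
        computed here from the spectrum for illustration only."""
        w = np.linalg.eigvalsh(A)
        return (int(np.sum(w < -tol)), int(np.sum(np.abs(w) <= tol)),
                int(np.sum(w > tol)))

    A = np.diag([3.0, -1.0, 0.0, 2.0])          # indefinite, singular
    assert inertia(A) == (1, 1, 2)

    # Sylvester's law of inertia: congruence preserves the inertia triple.
    rng = np.random.default_rng(5)
    M = rng.standard_normal((4, 4)) + 4.0 * np.eye(4)   # nonsingular
    assert inertia(M.T @ A @ M) == inertia(A)
    ```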

  • On the convergence properties of the Orthogonal Similarity transformations to tridiagonal and semiseparable (plus diagonal) form
    Numerische Mathematik, 2006
    Co-Authors: Raf Vandebril, Marc Van Barel, Ellen Van Camp, Nicola Mastronardi
    Abstract:

    In this paper we compare the convergence properties of three basic reduction methods by placing them in a general framework covering the reduction to tridiagonal, semiseparable, and semiseparable plus diagonal form. These reductions are often used as the first step in the computation of the eigenvalues and/or eigenvectors of arbitrary matrices; in this way the calculation of the eigenvalues using, for example, the QR-algorithm is reduced in complexity. First we investigate the convergence properties of these three reduction algorithms. It is shown that for the partially reduced matrices at step k of any of these algorithms, the lower right k × k (already reduced) sub-block has the Lanczos–Ritz values with respect to a certain starting vector. It is also shown that the reductions to semiseparable and to semiseparable plus diagonal form exhibit an extra convergence behavior: a special type of subspace iteration is performed on the lower right k × k submatrix, which contains these Ritz values. Second, we look in more detail at the behavior of the subspace iteration involved, and show that the reduction method can be interpreted as a nested type of multi-shift iteration. Theoretical results are presented that make it possible to predict the convergence behavior of these reduction algorithms, together with a theoretical bound on the convergence rate. Finally, we illustrate by means of numerical examples how the convergence behavior can be tuned so that it becomes a powerful tool for certain applications.
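    The Lanczos–Ritz values mentioned in the abstract can be demonstrated with a short sketch (an illustrative Lanczos reduction with full reorthogonalization for robustness; the paper's reduction algorithms are more elaborate):

    ```python
    import numpy as np

    def lanczos_tridiagonal(A, v0, k):
        """k steps of Lanczos: builds a k x k tridiagonal matrix whose
        eigenvalues are the Ritz values of A w.r.t. the starting vector v0.
        Full reorthogonalization keeps the basis numerically orthogonal."""
        n = A.shape[0]
        V = np.zeros((n, k))
        alpha = np.zeros(k)
        beta = np.zeros(k - 1)
        V[:, 0] = v0 / np.linalg.norm(v0)
        for j in range(k):
            w = A @ V[:, j]
            alpha[j] = V[:, j] @ w
            w -= V[:, :j + 1] @ (V[:, :j + 1].T @ w)   # reorthogonalize
            if j < k - 1:
                beta[j] = np.linalg.norm(w)
                V[:, j + 1] = w / beta[j]
        return np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)

    rng = np.random.default_rng(1)
    Qo, _ = np.linalg.qr(rng.standard_normal((50, 50)))
    A = Qo @ np.diag(np.linspace(1.0, 100.0, 50)) @ Qo.T  # spectrum [1, 100]
    T = lanczos_tridiagonal(A, rng.standard_normal(50), 20)
    ritz = np.linalg.eigvalsh(T)

    # Ritz values lie inside the spectrum; the extreme ones converge first.
    assert ritz[-1] <= 100.0 + 1e-8 and ritz[0] >= 1.0 - 1e-8
    assert ritz[-1] > 95.0 and ritz[0] < 5.0
    ```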

  • Orthogonal Similarity transformation of a symmetric matrix into a diagonal plus semiseparable one with free choice of the diagonal
    Numerische Mathematik, 2006
    Co-Authors: Raf Vandebril, Marc Van Barel, Ellen Van Camp, Nicola Mastronardi
    Abstract:

    In this paper we describe an Orthogonal Similarity transformation for transforming arbitrary symmetric matrices into a diagonal-plus-semiseparable matrix, where the diagonal can be chosen freely. Very recently an algorithm was proposed for transforming arbitrary symmetric matrices into similar semiseparable ones; this reduction is strongly connected to the reduction to tridiagonal form. Since the class of semiseparable matrices can be considered a subclass of the diagonal-plus-semiseparable matrices, the algorithm proposed here can be interpreted as an extension of the reduction to semiseparable form. A numerical experiment compares the accuracy of this reduction algorithm with that of the traditional reduction to tridiagonal form and of the reduction to semiseparable form; the experiment indicates that all three reduction algorithms are equally accurate. Moreover, the experiments show that asymptotically all three approaches have the same complexity, i.e., the same factor preceding the n^3 term in the computational cost. Finally, we illustrate that special choices of the diagonal create a specific convergence behavior.

G.-r. Duan - One of the best experts on this subject based on the ideXlab platform.

  • Right coprime factorisations using system upper Hessenberg forms - the multi-input system case
    IEE Proceedings - Control Theory and Applications, 2001
    Co-Authors: G.-r. Duan
    Abstract:

    Based on a method for right coprime factorisations of linear systems using matrix elementary transformations, it is shown that a very simple iteration formula exists for right coprime factorisations of multi-input linear systems in the system upper Hessenberg forms. This formula gives directly the coefficient matrices of the pair of solutions to the right coprime factorisation of the system Hessenberg form, and involves only inverses of a few triangular matrices and some matrix products and summations. Based on this formula, a simple, efficient procedure for determining a right coprime factorisation of a multi-input linear system is proposed, which first converts a given linear system into its system Hessenberg form using some Orthogonal Similarity transformations and then applies the iteration formula to the converted system Hessenberg form. An example demonstrates the use of the approach.
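    The first stage of the proposed procedure, conversion to the system Hessenberg form by Orthogonal Similarity transformations, can be sketched for the single-input case (an illustrative sketch only; the paper treats the general multi-input case, and the function name `system_hessenberg` is our own):

    ```python
    import numpy as np
    from scipy.linalg import hessenberg

    def system_hessenberg(A, b):
        """Find orthogonal Q with Q.T @ A @ Q upper Hessenberg and
        Q.T @ b = ||b|| * e1 (single-input system Hessenberg form)."""
        n = len(b)
        beta = np.linalg.norm(b)
        e1 = np.zeros(n); e1[0] = 1.0
        v = b - beta * e1                  # Householder direction: H1 b = beta e1
        if np.linalg.norm(v) > 1e-14 * beta:
            H1 = np.eye(n) - 2.0 * np.outer(v, v) / (v @ v)
        else:
            H1 = np.eye(n)
        # The orthogonal factor P of the Hessenberg reduction fixes e1,
        # so Q = H1 @ P still maps b to beta * e1.
        Hk, P = hessenberg(H1 @ A @ H1, calc_q=True)
        return H1 @ P, Hk

    rng = np.random.default_rng(3)
    A = rng.standard_normal((5, 5))
    b = rng.standard_normal(5)
    Q, Hk = system_hessenberg(A, b)

    assert np.allclose(Q.T @ A @ Q, Hk)        # orthogonal similarity
    assert np.allclose(np.tril(Hk, -2), 0)     # upper Hessenberg form
    assert np.allclose(Q.T @ b, np.linalg.norm(b) * np.eye(5)[:, 0])
    ```

    The iteration formula of the paper would then be applied to the pair (Hk, Q.T @ b).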

  • Right coprime factorizations using system upper Hessenberg forms-the multi-input system case
    Proceedings of the 39th IEEE Conference on Decision and Control (Cat. No.00CH37187), 2000
    Co-Authors: G.-r. Duan
    Abstract:

    Based on a method for right coprime factorizations of linear systems using matrix elementary transformations, it is shown that a very simple iteration formula exists for right coprime factorizations of multi-input linear systems in system upper Hessenberg forms. This formula gives directly the coefficient matrices of the pair of solutions to the right coprime factorization of the system Hessenberg form, and involves only inverses of a few triangular matrices and some matrix products and sums. Based on this formula, a simple, efficient procedure for determining a right coprime factorization of a multi-input linear system is proposed, which first converts a given linear system into its system Hessenberg form using some Orthogonal Similarity transformations and then applies the iteration formula to the converted system Hessenberg form. An example demonstrates the use of the approach.

Raf Vandebril - One of the best experts on this subject based on the ideXlab platform.

  • On the convergence properties of the Orthogonal Similarity transformations to tridiagonal and semiseparable (plus diagonal) form
    Numerische Mathematik, 2006
    Co-Authors: Raf Vandebril, Marc Van Barel, Ellen Van Camp, Nicola Mastronardi
    Abstract:

    In this paper we compare the convergence properties of three basic reduction methods by placing them in a general framework covering the reduction to tridiagonal, semiseparable, and semiseparable plus diagonal form. These reductions are often used as the first step in the computation of the eigenvalues and/or eigenvectors of arbitrary matrices; in this way the calculation of the eigenvalues using, for example, the QR-algorithm is reduced in complexity. First we investigate the convergence properties of these three reduction algorithms. It is shown that for the partially reduced matrices at step k of any of these algorithms, the lower right k × k (already reduced) sub-block has the Lanczos–Ritz values with respect to a certain starting vector. It is also shown that the reductions to semiseparable and to semiseparable plus diagonal form exhibit an extra convergence behavior: a special type of subspace iteration is performed on the lower right k × k submatrix, which contains these Ritz values. Second, we look in more detail at the behavior of the subspace iteration involved, and show that the reduction method can be interpreted as a nested type of multi-shift iteration. Theoretical results are presented that make it possible to predict the convergence behavior of these reduction algorithms, together with a theoretical bound on the convergence rate. Finally, we illustrate by means of numerical examples how the convergence behavior can be tuned so that it becomes a powerful tool for certain applications.

  • Orthogonal Similarity transformation of a symmetric matrix into a diagonal plus semiseparable one with free choice of the diagonal
    Numerische Mathematik, 2006
    Co-Authors: Raf Vandebril, Marc Van Barel, Ellen Van Camp, Nicola Mastronardi
    Abstract:

    In this paper we describe an Orthogonal Similarity transformation for transforming arbitrary symmetric matrices into a diagonal-plus-semiseparable matrix, where the diagonal can be chosen freely. Very recently an algorithm was proposed for transforming arbitrary symmetric matrices into similar semiseparable ones; this reduction is strongly connected to the reduction to tridiagonal form. Since the class of semiseparable matrices can be considered a subclass of the diagonal-plus-semiseparable matrices, the algorithm proposed here can be interpreted as an extension of the reduction to semiseparable form. A numerical experiment compares the accuracy of this reduction algorithm with that of the traditional reduction to tridiagonal form and of the reduction to semiseparable form; the experiment indicates that all three reduction algorithms are equally accurate. Moreover, the experiments show that asymptotically all three approaches have the same complexity, i.e., the same factor preceding the n^3 term in the computational cost. Finally, we illustrate that special choices of the diagonal create a specific convergence behavior.

  • Necessary and sufficient conditions for Orthogonal Similarity transformations to obtain the Arnoldi(Lanczos)–Ritz values
    Linear Algebra and its Applications, 2006
    Co-Authors: Raf Vandebril, Marc Van Barel
    Abstract:

    It is a well-known fact that while reducing a symmetric matrix into a similar tridiagonal one, the already tridiagonal part of the partially reduced matrix has the Lanczos–Ritz values as eigenvalues. This behavior is shared by the reduction algorithm that transforms symmetric matrices via Orthogonal Similarity transformations to semiseparable form. Moreover, the Orthogonal reduction to Hessenberg form exhibits similar behavior with respect to the Arnoldi–Ritz values. In this paper we investigate the Orthogonal Similarity transformations creating this behavior. Two conditions are derived that are necessary and sufficient for the partially reduced matrices to have the desired convergence behavior. The conditions are easy to check, as they only demand that in every step of the reduction algorithm two particular matrices have a zero block.

  • Rank structured matrix operations
    2006
    Co-Authors: Marc Van Barel, Raf Vandebril, Nicola Mastronardi, Steven Delvaux, Yvette Vanberghen
    Abstract:

    Classically, several numerical linear algebra problems are solved by means of tridiagonal (symmetric) and Hessenberg (unsymmetric) matrices. In this talk it will be shown that a similar role can be played by semiseparable matrices. A matrix is called semiseparable if all submatrices that can be taken out of the lower triangular part (the main diagonal included) have rank at most one. We will study how several matrix operations can be performed on such semiseparable, as well as more general rank-structured, matrices in an efficient and accurate way, and we will illustrate the algorithms by means of several numerical examples.
    1 Eigenvalue problems. When all eigenvalues (and eigenvectors) have to be computed for a given symmetric matrix of moderate size, the classical QR-algorithm is used: the symmetric matrix is transformed into a symmetric tridiagonal one by Orthogonal Similarity transformations and the QR-algorithm is then applied to this tridiagonal matrix. This transformation is performed because each step of the QR-algorithm applied to an n × n tridiagonal matrix requires only O(n) flops. Moreover, the eigenvalues of the leading principal submatrices of the tridiagonal matrix are the Lanczos–Ritz values. One may ask whether there are other classes of structured matrices with similar properties. In [9], necessary and sufficient conditions are derived such that the k × k leading principal submatrix of the matrix during the Orthogonal Similarity transformation process contains the Lanczos–Ritz values as eigenvalues, just as in the tridiagonal case. These conditions lead naturally to other classes of structured matrices (semiseparable and diagonal-plus-semiseparable matrices). Moreover, it turns out that each step of the QR-algorithm also needs only O(n) operations. In this talk we will give an overview of the use of (symmetric) semiseparable matrices as an alternative to tridiagonal ones.
There is a close relationship between the set of tridiagonal and semiseparable matrices. Indeed, the inverse of a nonsingular tridiagonal matrix is semiseparable and vice-versa.
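    This relationship can be verified numerically with a small NumPy sketch; the rank-1 check below follows the definition of semiseparability given above:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n = 8
    off = 1.0 + rng.random(n - 1)
    # nonsingular (diagonally dominant) symmetric tridiagonal matrix
    T = np.diag(6.0 + rng.random(n)) + np.diag(off, 1) + np.diag(off, -1)
    S = np.linalg.inv(T)

    def is_semiseparable(S, tol=1e-8):
        """True iff every submatrix taken out of the lower triangular
        part (main diagonal included) of S has rank at most one."""
        n = S.shape[0]
        for i in range(n):
            for j in range(1, i + 2):      # S[i:, :j] lies in the lower part
                if np.linalg.matrix_rank(S[i:, :j], tol) > 1:
                    return False
        return True

    assert is_semiseparable(S)       # inverse of tridiagonal: semiseparable
    assert not is_semiseparable(T)   # the tridiagonal matrix itself is not
    ```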

  • An Implicit Q Theorem for Hessenberg-like Matrices
    Mediterranean Journal of Mathematics, 2005
    Co-Authors: Raf Vandebril, Marc Van Barel, Nicola Mastronardi
    Abstract:

    The implicit Q theorem for Hessenberg matrices is a widespread and powerful theorem. It is used in the development of, for example, implicit QR algorithms to compute the eigendecomposition of Hessenberg matrices. Moreover it can also be used to prove the essential uniqueness of Orthogonal Similarity transformations of matrices to Hessenberg form. The theorem is also valid for symmetric tridiagonal matrices, proving thereby also in the symmetric case its power. Currently there is a growing interest to so-called semiseparable matrices. These matrices can be considered as the inverses of tridiagonal matrices. In a similar way, one can consider Hessenberg-like matrices as the inverses of Hessenberg matrices. In this paper, we formulate and prove an implicit Q theorem for the class of Hessenberg-like matrices. We introduce the notion of strongly unreduced Hessenberg-like matrices and also a method for transforming matrices via Orthogonal transformations to this form is proposed. Moreover, as the theorem is valid for Hessenberg-like matrices it is also valid for symmetric semiseparable matrices.

Marc Van Barel - One of the best experts on this subject based on the ideXlab platform.

  • On the convergence properties of the Orthogonal Similarity transformations to tridiagonal and semiseparable (plus diagonal) form
    Numerische Mathematik, 2006
    Co-Authors: Raf Vandebril, Marc Van Barel, Ellen Van Camp, Nicola Mastronardi
    Abstract:

    In this paper we compare the convergence properties of three basic reduction methods by placing them in a general framework covering the reduction to tridiagonal, semiseparable, and semiseparable plus diagonal form. These reductions are often used as the first step in the computation of the eigenvalues and/or eigenvectors of arbitrary matrices; in this way the calculation of the eigenvalues using, for example, the QR-algorithm is reduced in complexity. First we investigate the convergence properties of these three reduction algorithms. It is shown that for the partially reduced matrices at step k of any of these algorithms, the lower right k × k (already reduced) sub-block has the Lanczos–Ritz values with respect to a certain starting vector. It is also shown that the reductions to semiseparable and to semiseparable plus diagonal form exhibit an extra convergence behavior: a special type of subspace iteration is performed on the lower right k × k submatrix, which contains these Ritz values. Second, we look in more detail at the behavior of the subspace iteration involved, and show that the reduction method can be interpreted as a nested type of multi-shift iteration. Theoretical results are presented that make it possible to predict the convergence behavior of these reduction algorithms, together with a theoretical bound on the convergence rate. Finally, we illustrate by means of numerical examples how the convergence behavior can be tuned so that it becomes a powerful tool for certain applications.

  • Orthogonal Similarity transformation of a symmetric matrix into a diagonal plus semiseparable one with free choice of the diagonal
    Numerische Mathematik, 2006
    Co-Authors: Raf Vandebril, Marc Van Barel, Ellen Van Camp, Nicola Mastronardi
    Abstract:

    In this paper we describe an Orthogonal Similarity transformation for transforming arbitrary symmetric matrices into a diagonal-plus-semiseparable matrix, where the diagonal can be chosen freely. Very recently an algorithm was proposed for transforming arbitrary symmetric matrices into similar semiseparable ones; this reduction is strongly connected to the reduction to tridiagonal form. Since the class of semiseparable matrices can be considered a subclass of the diagonal-plus-semiseparable matrices, the algorithm proposed here can be interpreted as an extension of the reduction to semiseparable form. A numerical experiment compares the accuracy of this reduction algorithm with that of the traditional reduction to tridiagonal form and of the reduction to semiseparable form; the experiment indicates that all three reduction algorithms are equally accurate. Moreover, the experiments show that asymptotically all three approaches have the same complexity, i.e., the same factor preceding the n^3 term in the computational cost. Finally, we illustrate that special choices of the diagonal create a specific convergence behavior.

  • Necessary and sufficient conditions for Orthogonal Similarity transformations to obtain the Arnoldi(Lanczos)–Ritz values
    Linear Algebra and its Applications, 2006
    Co-Authors: Raf Vandebril, Marc Van Barel
    Abstract:

    It is a well-known fact that while reducing a symmetric matrix into a similar tridiagonal one, the already tridiagonal part of the partially reduced matrix has the Lanczos–Ritz values as eigenvalues. This behavior is shared by the reduction algorithm that transforms symmetric matrices via Orthogonal Similarity transformations to semiseparable form. Moreover, the Orthogonal reduction to Hessenberg form exhibits similar behavior with respect to the Arnoldi–Ritz values. In this paper we investigate the Orthogonal Similarity transformations creating this behavior. Two conditions are derived that are necessary and sufficient for the partially reduced matrices to have the desired convergence behavior. The conditions are easy to check, as they only demand that in every step of the reduction algorithm two particular matrices have a zero block.

  • Rank structured matrix operations
    2006
    Co-Authors: Marc Van Barel, Raf Vandebril, Nicola Mastronardi, Steven Delvaux, Yvette Vanberghen
    Abstract:

    Classically, several numerical linear algebra problems are solved by means of tridiagonal (symmetric) and Hessenberg (unsymmetric) matrices. In this talk it will be shown that a similar role can be played by semiseparable matrices. A matrix is called semiseparable if all submatrices that can be taken out of the lower triangular part (the main diagonal included) have rank at most one. We will study how several matrix operations can be performed on such semiseparable, as well as more general rank-structured, matrices in an efficient and accurate way, and we will illustrate the algorithms by means of several numerical examples.
    1 Eigenvalue problems. When all eigenvalues (and eigenvectors) have to be computed for a given symmetric matrix of moderate size, the classical QR-algorithm is used: the symmetric matrix is transformed into a symmetric tridiagonal one by Orthogonal Similarity transformations and the QR-algorithm is then applied to this tridiagonal matrix. This transformation is performed because each step of the QR-algorithm applied to an n × n tridiagonal matrix requires only O(n) flops. Moreover, the eigenvalues of the leading principal submatrices of the tridiagonal matrix are the Lanczos–Ritz values. One may ask whether there are other classes of structured matrices with similar properties. In [9], necessary and sufficient conditions are derived such that the k × k leading principal submatrix of the matrix during the Orthogonal Similarity transformation process contains the Lanczos–Ritz values as eigenvalues, just as in the tridiagonal case. These conditions lead naturally to other classes of structured matrices (semiseparable and diagonal-plus-semiseparable matrices). Moreover, it turns out that each step of the QR-algorithm also needs only O(n) operations. In this talk we will give an overview of the use of (symmetric) semiseparable matrices as an alternative to tridiagonal ones.
There is a close relationship between the set of tridiagonal and semiseparable matrices. Indeed, the inverse of a nonsingular tridiagonal matrix is semiseparable and vice-versa.

  • An Orthogonal Similarity reduction of a matrix into semiseparable form
    SIAM Journal on Matrix Analysis and Applications, 2005
    Co-Authors: Marc Van Barel, Raf Vandebril, Nicola Mastronardi
    Abstract:

    An algorithm is proposed to reduce a symmetric matrix, by Orthogonal Similarity transformations, to a similar semiseparable matrix of semiseparability rank 1. It is shown that, while running to completion, the algorithm yields information on the spectrum of the initial matrix. In fact, the proposed algorithm shares the properties of the Lanczos method and of the Householder reduction to tridiagonal form. Furthermore, at each iteration the algorithm applies a step of the QR method without shift to a principal submatrix in order to retrieve the semiseparable structure. The latter step can be considered a kind of subspace-like iteration in which the size of the subspace increases by one dimension at each step of the algorithm. Hence, when during the execution of the algorithm the Ritz values approximate the dominant eigenvalues closely enough, diagonal blocks appear in the semiseparable part, and the norm of the corresponding subdiagonal blocks goes to zero in the subsequent iteration steps, at a rate depending on the gap between the eigenvalues. A numerical experiment illustrating the properties of the new algorithm is included.

Kaiming Zhao - One of the best experts on this subject based on the ideXlab platform.

  • NORMAL FORMS FOR Orthogonal Similarity CLASSES OF SKEW-SYMMETRIC MATRICES
    Journal of Algebra, 2007
    Co-Authors: Dragomir Ž. Ðoković, Konstanze Rietsch, Kaiming Zhao
    Abstract:

    Let F be an algebraically closed field of characteristic different from 2. Define the Orthogonal group, On(F), as the group of n by n matrices X over F such that XX′=In, where X′ is the transpose of X and In the identity matrix. We show that every nonsingular n by n skew-symmetric matrix over F is Orthogonally similar to a bidiagonal skew-symmetric matrix. In the singular case one has to allow some 4-diagonal blocks as well. If further the characteristic is 0, we construct the normal form for the On(F)-Similarity classes of skew-symmetric matrices. In this case, the known normal forms (as presented in the well-known book by Gantmacher) are quite different. Finally we study some related varieties of matrices. We prove that the variety of normalized nilpotent n by n bidiagonal matrices for n=2s+1 is irreducible of dimension s. As a consequence the skew-symmetric nilpotent n by n bidiagonal matrices are shown to form a variety of pure dimension s.

  • Normal forms for Orthogonal Similarity classes of skew-symmetric matrices
    arXiv: Representation Theory, 2006
    Co-Authors: Dragomir Z. Djokovic, Konstanze Rietsch, Kaiming Zhao
    Abstract:

    Let F be an algebraically closed field of characteristic different from 2. We show that every nonsingular skew-symmetric n by n matrix X over F is Orthogonally similar to a bidiagonal skew-symmetric matrix. In the singular case one has to allow some 4-diagonal blocks as well. If further the characteristic is 0, we construct the normal form for O_n(F)-Similarity classes of skew-symmetric matrices. In this case the known normal forms (as presented in the well-known book by Gantmacher) are quite different. Finally we study some related varieties of matrices. We prove that the variety of normalized nilpotent n by n bidiagonal matrices for n=2s+1 is irreducible of dimension s. As a consequence the skew-symmetric nilpotent bidiagonal n by n matrices are shown to form a variety of pure dimension s.

  • Tridiagonal normal forms for Orthogonal Similarity classes of symmetric matrices
    Linear Algebra and its Applications, 2004
    Co-Authors: Dragomir Ž. Đoković, Kaiming Zhao
    Abstract:

    Let F be an algebraically closed field of characteristic different from 2. Define the Orthogonal group, On(F), as the group of n by n matrices X over F such that XX′ = In, where X′ is the transpose of X and In the identity matrix. We show that every n by n symmetric matrix over F is Orthogonally similar to a tridiagonal symmetric matrix. If further the characteristic is 0, we construct the tridiagonal normal form for the On(F)-Similarity classes of symmetric matrices. We point out that, in this case, the known normal forms (as presented in the well-known book by Gantmacher) are not tridiagonal.