Eigenvector

14,000,000 Leading Edge Experts on the ideXlab platform

Scan Science and Technology

Contact Leading Edge Experts & Companies

The Experts below are selected from a list of 102,228 Experts worldwide ranked by the ideXlab platform

Nathan Noiry - One of the best experts on this subject based on the ideXlab platform.

  • Spectral Measures of Spiked Random Matrices
    Journal of Theoretical Probability, 2020
    Co-Authors: Nathan Noiry
    Abstract:

    We study two spiked models of random matrices under general frameworks corresponding, respectively, to additive deformation of random symmetric matrices and multiplicative perturbation of random covariance matrices. In both cases, the limiting spectral measure in the direction of an Eigenvector of the perturbation leads to old and new results on the coordinates of Eigenvectors.
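The additive deformation model the abstract describes can be illustrated numerically. The sketch below is a toy illustration, not the paper's construction: the perturbation direction `v`, the strength `theta`, and the matrix size are arbitrary choices. It shows the top Eigenvector of a rank-one-deformed Wigner matrix aligning with the perturbation direction:

```python
import numpy as np

rng = np.random.default_rng(0)
n, theta = 2000, 3.0  # theta chosen large enough for the spike to leave the bulk

# Symmetric random matrix with entries of variance 1/n (Wigner scaling)
A = rng.normal(size=(n, n)) / np.sqrt(n)
W = (A + A.T) / np.sqrt(2)

# Additive rank-one deformation in the direction v
v = np.zeros(n)
v[0] = 1.0
M = W + theta * np.outer(v, v)

# The Eigenvector of the largest eigenvalue concentrates on v:
# its squared overlap with v approaches 1 - 1/theta**2 for large n
top = np.linalg.eigh(M)[1][:, -1]
overlap = float((top @ v) ** 2)
print(overlap)
```

For this strength the limiting overlap is 1 - 1/9 ≈ 0.89; below a critical strength the top Eigenvector carries no information about the perturbation direction, which is the kind of phase-transition behavior spiked-model results make precise.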

Shaogang Gong - One of the best experts on this subject based on the ideXlab platform.

  • Spectral clustering with Eigenvector selection
    Pattern Recognition, 2008
    Co-Authors: Tao Xiang, Shaogang Gong
    Abstract:

The task of discovering natural groupings of input patterns, or clustering, is an important aspect of machine learning and pattern analysis. In this paper, we study the widely used spectral clustering algorithm, which clusters data using Eigenvectors of a similarity/affinity matrix derived from a data set. In particular, we aim to solve two critical issues in spectral clustering: (1) how to automatically determine the number of clusters, and (2) how to perform effective clustering given noisy and sparse data. An analysis of the characteristics of the eigenspace shows that (a) not every Eigenvector of a data affinity matrix is informative and relevant for clustering; (b) Eigenvector selection is critical, because using uninformative/irrelevant Eigenvectors could lead to poor clustering results; and (c) the corresponding eigenvalues cannot be used to select relevant Eigenvectors for a realistic data set. Motivated by this analysis, a novel spectral clustering algorithm is proposed which differs from previous approaches in that only informative/relevant Eigenvectors are employed for determining the number of clusters and performing clustering. The key element of the proposed algorithm is a simple but effective relevance learning method which measures the relevance of an Eigenvector according to how well it can separate the data set into different clusters. Our algorithm was evaluated on synthetic data sets as well as real-world data sets generated from two challenging visual learning problems. The results demonstrate that our algorithm is able to estimate the cluster number correctly and reveal the natural grouping of the input data/patterns even given sparse and noisy data.
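The selection idea can be sketched on toy data. Note this is a simplified stand-in, not the authors' method: the paper learns relevance by fitting models to Eigenvector entries, whereas the crude separation score below, the Gaussian affinity, and `sigma` are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: two well-separated 2-D clusters, 30 points each
X = np.vstack([rng.normal(0.0, 0.3, (30, 2)), rng.normal(3.0, 0.3, (30, 2))])

# Gaussian affinity and its symmetric normalization (sigma is hand-tuned here)
sigma = 1.0
D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
A = np.exp(-D2 / (2 * sigma**2))
d = A.sum(1)
L = A / np.sqrt(np.outer(d, d))
w, V = np.linalg.eigh(L)
V = V[:, ::-1]  # columns ordered by decreasing eigenvalue

def relevance(u):
    """Crude separation score: high when the entries split into two
    well-separated groups (a stand-in for the paper's learned relevance)."""
    lo, hi = u[u <= np.median(u)], u[u > np.median(u)]
    return (lo.mean() - hi.mean()) ** 2 / (lo.var() + hi.var() + 1e-12)

# Score the leading Eigenvectors; only some of them are informative
scores = [relevance(V[:, k]) for k in range(5)]
```

Clustering would then use only the high-scoring Eigenvectors, rather than blindly taking the top K as classical spectral clustering does.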

Kou-yuan Huang - One of the best experts on this subject based on the ideXlab platform.

  • Neural networks for seismic principal components analysis
    IEEE Transactions on Geoscience and Remote Sensing, 1999
    Co-Authors: Kou-yuan Huang
    Abstract:

The neural network, using an unsupervised generalized Hebbian algorithm (GHA), is adopted to find the principal Eigenvectors of a covariance matrix in different kinds of seismograms. The authors show through extensive computer experiments that principal components analysis (PCA) using the GHA neural net can extract information on seismic reflection layers and uniform neighboring traces. The analyzed seismic data are seismic traces with 20-, 25-, and 30-Hz Ricker wavelets, the fault, the reflection and diffraction patterns after normal moveout (NMO) correction, the bright spot pattern, and a real seismogram at Mississippi Canyon. The properties of high amplitude, low frequency, and polarity reversal can be seen from the projections onto the principal Eigenvectors. For PCA, a theorem is proposed which states that adding an extra point along the direction of an existing Eigenvector enhances that Eigenvector. The theorem is applied to the interpretation of a fault seismogram and the uniform property of the other seismograms. PCA also provides significant seismic data compression.
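The GHA update (Sanger's rule) the abstract relies on can be sketched directly. The synthetic "traces" below are an illustrative stand-in for seismic data, and the learning rate and dimensions are arbitrary assumptions; the point is that the weight rows converge to the leading Eigenvectors of the covariance matrix without ever forming that matrix:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic traces: samples with one dominant direction of variance
n_dim, n_samples = 8, 4000
principal = rng.normal(size=n_dim)
principal /= np.linalg.norm(principal)
X = (rng.normal(size=(n_samples, 1)) * 3.0) * principal \
    + rng.normal(scale=0.3, size=(n_samples, n_dim))
X -= X.mean(0)

# Sanger's rule: dW = eta * (y x^T - LT[y y^T] W), where LT is the
# lower-triangular part; row k converges to the k-th principal Eigenvector.
n_components, eta = 2, 0.01
W = rng.normal(scale=0.1, size=(n_components, n_dim))
for x in X:
    y = W @ x
    W += eta * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)

# Compare the first learned row with the exact leading Eigenvector
C = X.T @ X / n_samples
true_top = np.linalg.eigh(C)[1][:, -1]
overlap = abs((W[0] / np.linalg.norm(W[0])) @ true_top)
```

The lower-triangular term is what makes the rule "generalized": it deflates each row against the rows above it, so successive rows pick up successive principal components rather than all collapsing onto the first.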

  • Neural computing for seismic principal components analysis
IGARSS'97. 1997 IEEE International Geoscience and Remote Sensing Symposium Proceedings. Remote Sensing - A Scientific Vision for Sustainable Development, 1997
    Co-Authors: Kou-yuan Huang
    Abstract:

The neural network of the unsupervised generalized Hebbian algorithm (GHA) is adopted to find the principal Eigenvectors of a covariance matrix in different kinds of seismograms. A theorem on the effect of adding one extra point along the direction of an Eigenvector is proposed to support the interpretation that more uniform data vectors along one principal Eigenvector direction enhance the corresponding eigenvalue. Diffraction patterns, fault patterns, bright spot patterns, and real seismograms are used in the experiments. The analyses show that the principal components reveal high amplitude, polarity reversal, and low-frequency wavelets in the detection of seismic anomalies and can improve seismic interpretation.

  • IJCNN - Neural network for seismic principal components analysis
IJCNN'99. International Joint Conference on Neural Networks. Proceedings (Cat. No.99CH36339), 1999
    Co-Authors: Kou-yuan Huang
    Abstract:

The neural network using an unsupervised generalized Hebbian algorithm (GHA) is adopted to find the principal Eigenvectors of a covariance matrix in different kinds of seismograms. We show through extensive computer experiments that principal components analysis (PCA) using the GHA neural net can extract information on seismic reflection layers and uniform neighboring traces. The analyzed seismic data are seismic traces with 20-, 25-, and 30-Hz Ricker wavelets, the fault, the reflection and diffraction patterns after NMO correction, the bright spot pattern, and a real seismogram at Mississippi Canyon. The properties of high amplitude, low frequency, and polarity reversal can be seen from the projections onto the principal Eigenvectors. For PCA, a theorem is proposed stating that adding an extra point along the direction of an existing Eigenvector enhances that Eigenvector. The theorem is applied to the interpretation of a fault seismogram and the uniform property of the other seismograms. PCA also provides significant seismic data compression.

Pradipta Mitra - One of the best experts on this subject based on the ideXlab platform.

  • Spectral clustering by recursive partitioning
    European Symposium on Algorithms, 2006
    Co-Authors: A Dasgupta, John E Hopcroft, Ravi Kannan, Pradipta Mitra
    Abstract:

In this paper, we analyze the second-Eigenvector technique of spectral partitioning on the planted partition random graph model, constructing a recursive algorithm that uses the second Eigenvectors to learn the planted partitions. The correctness of our algorithm is not based on the ratio-cut interpretation of the second Eigenvector but instead exploits the stability of the Eigenvector subspace. As a result, we obtain an improved cluster separation bound in terms of the dependence on the maximum variance. We also extend our results to a clustering problem in the case of sparse graphs.

Maoguo Gong - One of the best experts on this subject based on the ideXlab platform.

  • Spectral clustering with Eigenvector selection based on entropy ranking
    Neurocomputing, 2010
    Co-Authors: Feng Zhao, Licheng Jiao, Hanqiang Liu, Xinbo Gao, Maoguo Gong
    Abstract:

The Ng-Jordan-Weiss (NJW) method is one of the most widely used spectral clustering algorithms. For a K-cluster problem, this method partitions data using the largest K Eigenvectors of the normalized affinity matrix derived from the dataset. It has been demonstrated that the spectral relaxation solution of K-way grouping lies in the subspace of the largest K Eigenvectors. However, we find from extensive experiments that the top K Eigenvectors cannot always detect the structure of the data in real pattern recognition problems, so it is necessary to select Eigenvectors for spectral clustering. We propose an Eigenvector selection method based on entropy ranking for spectral clustering (ESBER). In this method, all the Eigenvectors are first ranked according to their importance for clustering, and a suitable Eigenvector combination is then obtained from the ranking list. We propose two strategies to select Eigenvectors from the ranking list. One directly adopts the first K Eigenvectors in the ranking list; unlike the largest K Eigenvectors of the NJW method, these are the most important Eigenvectors among all the Eigenvectors. The other strategy searches for a suitable Eigenvector combination among the first Km (Km>K) Eigenvectors in the ranking list; the combination obtained in this way can reflect the structure of the original data and lead to a satisfying spectral clustering result. Furthermore, we present computational complexity reduction strategies for the ESBER method to deal with large-scale datasets. We have performed experiments on UCI benchmark datasets, MNIST handwritten digit datasets, and Brodatz texture datasets, adopting the NJW method as a baseline for comparison. The experimental results show that the ESBER method is more robust than the NJW method; in particular, the ESBER method with the latter Eigenvector selection strategy obtains satisfying clustering results in most cases.
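The intuition behind entropy-based ranking can be illustrated on synthetic Eigenvector profiles. The histogram entropy below is a simplified stand-in for the paper's actual ranking criterion, and both profiles are fabricated for illustration: an Eigenvector whose entries concentrate into a few well-separated groups (useful for clustering) scores lower than one whose entries are spread out like noise:

```python
import numpy as np

rng = np.random.default_rng(4)

# Two toy "Eigenvectors": a bimodal one (informative for clustering)
# and a roughly uniform one (uninformative noise)
informative = np.concatenate([rng.normal(-1, 0.1, 50), rng.normal(1, 0.1, 50)])
noise = rng.uniform(-1, 1, 100)

def hist_entropy(u, bins=10):
    """Shannon entropy of the entry-value histogram; a concentrated,
    well-separated profile scores lower than a spread-out one."""
    counts, _ = np.histogram(u, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

# Rank candidate Eigenvectors by ascending entropy and keep the best-ranked
scores = {"informative": hist_entropy(informative), "noise": hist_entropy(noise)}
```

Ranking all Eigenvectors by such a score and then selecting the top K of the ranking (or searching combinations among the top Km) mirrors the two ESBER strategies described above.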