Principal Component Analysis

The experts below are selected from a list of 408,645 experts worldwide, ranked by the ideXlab platform.

Wei Dong Zhao - One of the best experts on this subject based on the ideXlab platform.

  • Incremental Tensor Principal Component Analysis for Handwritten Digit Recognition
    Mathematical Problems in Engineering, 2014
    Co-Authors: Wei Dong Zhao, Dan Li, Jiliu Zhou
    Abstract:

    To overcome the shortcomings of traditional dimensionality reduction algorithms, an incremental tensor Principal Component Analysis (ITPCA) algorithm based on the updated-SVD technique is proposed in this paper. The paper theoretically proves the relationship between PCA, 2DPCA, MPCA, and the graph embedding framework, and derives the incremental learning procedure for adding a single sample and multiple samples in detail. Experiments on handwritten digit recognition demonstrate that ITPCA achieves better recognition performance than vector-based Principal Component Analysis (PCA), incremental Principal Component Analysis (IPCA), and multilinear Principal Component Analysis (MPCA), while also having lower time and space complexity.

  • Incremental Tensor Principal Component Analysis for Image Recognition
    Advanced Materials Research, 2013
    Co-Authors: Wei Dong Zhao
    Abstract:

    To address the disadvantages of traditional off-line, vector-based learning algorithms, this paper proposes an Incremental Tensor Principal Component Analysis (ITPCA) algorithm. It represents an image as a tensor and performs incremental Principal Component Analysis learning based on an update-SVD technique. On the one hand, the proposed algorithm helps preserve the structural information of the image; on the other hand, it solves the training problem for new samples. Experiments on handwritten numeral recognition demonstrate that the algorithm achieves better performance than traditional vector-based Incremental Principal Component Analysis (IPCA) and Multilinear Principal Component Analysis (MPCA) algorithms.
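
Both ITPCA abstracts above build on an updated-SVD step: when a new sample arrives, the existing low-rank decomposition is revised without recomputing the SVD of the whole data set. The sketch below illustrates that step for plain vector-based incremental PCA (the tensor case applies an analogous update per mode); it assumes samples are already mean-centred, and the function names and rank parameter `k` are illustrative choices, not code from the papers.

```python
import numpy as np

def init_pca(X, k):
    """Initial rank-k SVD basis from a batch of column samples X (d x n)."""
    U, s, _ = np.linalg.svd(X, full_matrices=False)
    return U[:, :k], s[:k]

def incremental_update(U, s, x, k):
    """Fold one new (mean-centred) sample x into the rank-k basis (U, s).

    Standard updated-SVD recipe: project x onto the current basis, append
    the orthogonal residual, re-diagonalise a small (k+1)x(k+1) core
    matrix, and truncate back to rank k.
    """
    p = U.T @ x                      # coefficients in the current basis
    r = x - U @ p                    # component orthogonal to the basis
    r_norm = np.linalg.norm(r)
    q = r / r_norm if r_norm > 1e-12 else np.zeros_like(r)

    # Small core matrix whose SVD rotates the enlarged basis.
    K = np.zeros((len(s) + 1, len(s) + 1))
    K[:len(s), :len(s)] = np.diag(s)
    K[:len(s), -1] = p
    K[-1, -1] = r_norm

    Uk, sk, _ = np.linalg.svd(K)
    U_new = np.hstack([U, q[:, None]]) @ Uk
    return U_new[:, :k], sk[:k]      # truncate back to rank k
```

The cost of one update depends only on the data dimension and the kept rank k, not on the number of samples already seen, which is what makes the incremental scheme attractive for streams of new training images.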

Yang Liu - One of the best experts on this subject based on the ideXlab platform.

  • Joint sparse Principal Component Analysis
    Pattern Recognition, 2017
    Co-Authors: Zhihui Lai, Yiuming Cheung, Yang Liu
    Abstract:

    Principal Component Analysis (PCA) is widely used in dimensionality reduction. Many variants of PCA have been proposed to improve the robustness of the algorithm. However, the existing methods either cannot select useful features consistently or remain sensitive to outliers, which degrades their classification accuracy. In this paper, a novel approach called Joint Sparse Principal Component Analysis (JSPCA) is proposed to jointly select useful features and enhance robustness to outliers. Specifically, JSPCA relaxes the orthogonality constraint on the transformation matrix so that it has more freedom to jointly select useful features for the low-dimensional representation. JSPCA imposes joint sparse constraints on its objective function, i.e., the ℓ2,1-norm is imposed on both the loss term and the regularization term, to improve the algorithm's robustness. A simple yet effective optimization solution is presented and theoretical analyses of JSPCA are provided. Experimental results on eight data sets demonstrate that the proposed approach is feasible and effective.
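
The key ingredient in this abstract is the ℓ2,1-norm, which sums the ℓ2 norms of the rows of a matrix; penalizing a transformation matrix this way drives whole rows to zero, so a feature is kept or discarded jointly across all components. The toy snippet below, with a made-up loading matrix `Q`, only illustrates that row-sparsity effect; it is not the JSPCA optimization itself.

```python
import numpy as np

def l21_norm(M):
    """l2,1 norm: sum of the Euclidean norms of the rows of M."""
    return np.sum(np.linalg.norm(M, axis=1))

# Toy loading matrix Q (features x components). Rows driven to zero by a
# joint sparse penalty correspond to features dropped for every component
# simultaneously.
Q = np.array([[0.8, -0.1],
              [0.0,  0.0],    # feature 2 discarded jointly
              [0.3,  0.7],
              [0.0,  0.0]])   # feature 4 discarded jointly

print("l2,1 norm of Q:", l21_norm(Q))
selected = np.where(np.linalg.norm(Q, axis=1) > 1e-8)[0]
print("jointly selected features:", selected)   # -> [0 2]
```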

Shuicheng Yan - One of the best experts on this subject based on the ideXlab platform.

  • Inductive robust Principal Component Analysis
    IEEE Transactions on Image Processing, 2012
    Co-Authors: Bingkun Bao, Guangcan Liu, Shuicheng Yan
    Abstract:

    In this paper, we address the error correction problem, that is, to uncover the low-dimensional subspace structure from high-dimensional observations, which are possibly corrupted by errors. When the errors are of Gaussian distribution, Principal Component Analysis (PCA) can find the optimal (in terms of least-square error) low-rank approximation to high-dimensional data. However, the canonical PCA method is known to be extremely fragile to the presence of gross corruptions. Recently, Wright established a so-called robust Principal Component Analysis (RPCA) method, which can well handle the grossly corrupted data. However, RPCA is a transductive method and does not handle well the new samples, which are not involved in the training procedure. Given a new datum, RPCA essentially needs to recalculate over all the data, resulting in high computational cost. So, RPCA is inappropriate for the applications that require fast online computation. To overcome this limitation, in this paper, we propose an inductive robust Principal Component Analysis (IRPCA) method. Given a set of training data, unlike RPCA that targets on recovering the original data matrix, IRPCA aims at learning the underlying projection matrix, which can be used to efficiently remove the possible corruptions in any datum. The learning is done by solving a nuclear-norm regularized minimization problem, which is convex and can be solved in polynomial time. Extensive experiments on a benchmark human face dataset and two video surveillance datasets show that IRPCA cannot only be robust to gross corruptions, but also handle the new data well and in an efficient way.
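
Two pieces mentioned in the abstract can be sketched briefly: the singular value thresholding operator, which is the proximal step underlying nuclear-norm regularized minimization, and the inductive use of a learned projection matrix on a new sample. The full solver that produces the projection matrix `P` is omitted here, and the variable names are illustrative, not taken from the paper.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: the proximal operator of tau * ||.||_*.

    This shrinkage step is the basic building block of nuclear-norm
    regularized solvers such as the one IRPCA relies on.
    """
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)
    return U @ np.diag(s_shrunk) @ Vt

# Once a low-rank projection matrix P (d x d) has been learned from the
# training data, a new, possibly corrupted sample x is cleaned inductively
# with a single matrix-vector product, with no retraining over all data:
#   x_clean = P @ x
```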

M R Oliveira - One of the best experts on this subject based on the ideXlab platform.

  • Algorithms for projection-pursuit robust Principal Component Analysis
    Chemometrics and Intelligent Laboratory Systems, 2007
    Co-Authors: Christophe Croux, Peter Filzmoser, M R Oliveira
    Abstract:

    Principal Component Analysis (PCA) is very sensitive to the presence of outliers. One of the most appealing robust methods for Principal Component Analysis uses the projection-pursuit principle: the data are projected onto a lower-dimensional space such that a robust measure of variance of the projected data is maximized. The projection-pursuit based method for Principal Component Analysis has recently been introduced in the field of chemometrics, where the number of variables is typically large. In this paper, it is shown that the currently available algorithm for robust projection-pursuit PCA performs poorly in the presence of many variables. A new algorithm is proposed that is more suitable for the analysis of chemical data. Its performance is studied by means of simulation experiments and illustrated on some real data sets.
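
A minimal illustration of the projection-pursuit principle described above: search over candidate directions (here simply the centred observations themselves, one common choice), keep the direction whose projected data have the largest robust scale (MAD), deflate, and repeat. This is a generic sketch of the principle, not the improved algorithm the paper proposes for data with many variables.

```python
import numpy as np

def mad(v):
    """Median absolute deviation, a robust scale estimate."""
    return np.median(np.abs(v - np.median(v)))

def pp_robust_pca(X, n_components):
    """Projection-pursuit PCA sketch for X of shape (n_samples, n_features)."""
    Xc = X - np.median(X, axis=0)            # robust centring
    components = []
    for _ in range(n_components):
        norms = np.linalg.norm(Xc, axis=1)
        mask = norms > 1e-12
        candidates = Xc[mask] / norms[mask][:, None]   # unit directions
        scores = [mad(Xc @ a) for a in candidates]     # robust spread per direction
        a = candidates[int(np.argmax(scores))]
        components.append(a)
        Xc = Xc - np.outer(Xc @ a, a)         # deflate: remove found direction
    return np.vstack(components)
```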

Christopher M. Bishop - One of the best experts on this subject based on the ideXlab platform.

  • Probabilistic Principal Component Analysis
    Journal of the Royal Statistical Society. Series B: Statistical Methodology, 1999
    Co-Authors: Michael E. Tipping, Christopher M. Bishop
    Abstract:

    Principal Component Analysis (PCA) is a ubiquitous technique for data analysis and processing, but one which is not based on a probability model. In this paper we demonstrate how the principal axes of a set of observed data vectors may be determined through maximum likelihood estimation of parameters in a latent variable model closely related to factor analysis. We consider the properties of the associated likelihood function, giving an EM algorithm for estimating the principal subspace iteratively, and discuss the advantages conveyed by the definition of a probability density function for PCA.
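
A compact numpy sketch of the EM iteration for probabilistic PCA described in the abstract: the E-step computes posterior moments of the latent variables, and the M-step re-estimates the loading matrix W and the isotropic noise variance sigma^2. Initialization and convergence checks are simplified for brevity.

```python
import numpy as np

def ppca_em(X, q, n_iter=100):
    """EM for probabilistic PCA on X of shape (n_samples, d); q latent dims.

    Returns the loading matrix W (d x q) and the noise variance sigma2.
    """
    n, d = X.shape
    mu = X.mean(axis=0)
    Xc = X - mu
    rng = np.random.default_rng(0)
    W = rng.standard_normal((d, q))
    sigma2 = 1.0
    for _ in range(n_iter):
        # E-step: posterior moments of the latent variables.
        M = W.T @ W + sigma2 * np.eye(q)
        Minv = np.linalg.inv(M)
        Ez = Xc @ W @ Minv                        # E[z_n], one row per sample
        Ezz = n * sigma2 * Minv + Ez.T @ Ez       # sum_n E[z_n z_n^T]
        # M-step: re-estimate loadings and noise variance.
        W = Xc.T @ Ez @ np.linalg.inv(Ezz)
        sigma2 = (np.sum(Xc**2)
                  - 2.0 * np.sum((Xc @ W) * Ez)
                  + np.trace(Ezz @ W.T @ W)) / (n * d)
    return W, sigma2
```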