Identity Parameter


The Experts below are selected from a list of 10,374 Experts worldwide, ranked by the ideXlab platform.

M Savvides - One of the best experts on this subject based on the ideXlab platform.

  • Individual Kernel Tensor Subspaces for Robust Face Recognition: A Computationally Efficient Tensor Framework Without Requiring Mode Factorization
    Systems Man and Cybernetics, 2007
    Co-Authors: Sung Won Park, M Savvides
    Abstract:

    Facial images change appearance due to multiple factors such as different poses, lighting variations, and facial expressions. Tensors are higher-order extensions of vectors and matrices, which make it possible to analyze the different appearance factors of facial variation. Using higher-order tensors, we can construct a multilinear structure and model the multiple factors of face variation. In particular, among the appearance factors, the factor of a person's Identity modeled by a tensor structure can be used for face recognition. However, tensor-based face recognition creates difficulty in factorizing the unknown Parameters of a new test image and solving for the person-Identity Parameter. In this paper, to overcome this limitation of applying tensor-based methods to face recognition, we propose a novel tensor approach based on an individual-modeling method and nonlinear mappings. The proposed method does not require the problematic tensor factorization and is more efficient than the traditional TensorFaces method with respect to computation and memory. We set up the problem of solving for the unknown factors as a least squares problem with a quadratic equality constraint and solve it using numerical optimization techniques. We show that an individual-multilinear approach reduces the order of the tensor, making face-recognition tasks computationally efficient as well as analytically simpler. We also show that nonlinear kernel mappings can be applied to this optimization problem and provide more accuracy to face-recognition systems than linear mappings. We show that the proposed method, individual kernel TensorFaces, produces better discrimination power for classification. The novelty in our approach compared to previous work is that the individual kernel TensorFaces method does not require estimating any factor of a new test image for face recognition. In addition, we do not need any a priori knowledge of, or assumptions about, the factors of a test image when using the proposed method. We can apply individual kernel TensorFaces even if the factors of a test image are absent from the training set. Based on various experiments on the Carnegie Mellon University Pose, Illumination, and Expression database, we demonstrate that the proposed method produces reliable results for face recognition.
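
    The abstract formulates factor recovery as a least squares problem with a quadratic equality constraint. The sketch below illustrates only that formulation, not the paper's tensor model: the matrix A, the vector b, and the choice of SciPy's SLSQP solver are illustrative assumptions, with random stand-in data.

    import numpy as np
    from scipy.optimize import minimize

    # Minimal sketch: minimize ||A x - b||^2 subject to ||x||^2 = 1, the
    # "least squares with a quadratic equality constraint" form named in the
    # abstract. A and b are random stand-ins, not the paper's tensor model.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 8))   # hypothetical basis matrix
    b = rng.standard_normal(50)        # hypothetical test-image projection

    def objective(x):
        r = A @ x - b
        return r @ r                   # squared residual norm

    def objective_grad(x):
        return 2.0 * A.T @ (A @ x - b)

    # Quadratic equality constraint ||x||^2 - 1 = 0.
    unit_norm = {"type": "eq",
                 "fun": lambda x: x @ x - 1.0,
                 "jac": lambda x: 2.0 * x}

    x0 = np.ones(8) / np.sqrt(8)       # feasible start on the unit sphere
    res = minimize(objective, x0, jac=objective_grad,
                   method="SLSQP", constraints=[unit_norm])
    print("estimated factor:", res.x)
    print("residual norm:", np.linalg.norm(A @ res.x - b))

    A kernelized variant, as described in the abstract, would change the objective while keeping the same constrained optimization structure.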

Kuljeet Kaur - One of the best experts on this subject based on the ideXlab platform.

  • Networks Domain
    2013
    Co-Authors: Kuljeet Kaur
    Abstract:

    Whenever there is communication between a client and a server over a public link and resources are to be accessed from remote systems, proving an Identity becomes quite complex because proper access rights and authentication are needed. Complete security at the transport layer starts with proof of authentication; most organizations use only a password for security, but this research paper adds one more tier of security to the transport layer security protocol by using fingerprints for Identity authentication. Bio-hashing based on fingerprint minutiae points is used for mutual authentication. A complete comparative analysis of the existing password authentication schemes, on the basis of security requirements and attacks, is carried out in this paper. The result identifies which existing schemes can satisfy the security requirement of mutual authentication when a fingerprint is used as an Identity Parameter along with a password. It is shown that, with mutual authentication, intruders cannot carry out phishing, IP or server spoofing, Smurf attacks, DNS poisoning, etc. The paper focuses on implementing passwords and fingerprints for mutual authentication in a multi-server environment, which will generate an Ideal Password Authentication Scheme and will result in fortification o
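
    The abstract does not give the scheme's message flow, so the sketch below is only a simplified illustration of how a password and a fingerprint-derived biohash could both feed a challenge-response mutual authentication, using Python's standard library. The minutiae format, the PBKDF2 key derivation, and the nonce handshake are assumptions for illustration, not the paper's protocol.

    import hashlib
    import hmac
    import secrets

    def biohash(minutiae):
        # Hash a canonically ordered list of (x, y, angle) minutiae tuples.
        canon = ",".join(f"{x}:{y}:{a}" for x, y, a in sorted(minutiae))
        return hashlib.sha256(canon.encode()).digest()

    def shared_key(password, fp_hash, salt):
        # Derive a key that requires both factors: password and biohash.
        return hashlib.pbkdf2_hmac("sha256", password.encode() + fp_hash, salt, 100_000)

    # Enrollment: the server stores the salt and derived key, never the raw fingerprint.
    minutiae = [(12, 40, 85), (33, 7, 190), (58, 61, 20)]   # toy minutiae points
    salt = secrets.token_bytes(16)
    server_key = shared_key("correct horse", biohash(minutiae), salt)

    # Mutual authentication: each side proves knowledge of the two-factor key
    # by returning an HMAC over the other side's fresh nonce.
    client_nonce = secrets.token_bytes(16)
    server_nonce = secrets.token_bytes(16)

    client_key = shared_key("correct horse", biohash(minutiae), salt)
    client_proof = hmac.new(client_key, server_nonce, "sha256").digest()
    server_proof = hmac.new(server_key, client_nonce, "sha256").digest()

    assert hmac.compare_digest(client_proof, hmac.new(server_key, server_nonce, "sha256").digest())
    assert hmac.compare_digest(server_proof, hmac.new(client_key, client_nonce, "sha256").digest())
    print("mutual authentication succeeded")

    Because each side must answer the other's fresh nonce with an HMAC keyed on the two-factor secret, a spoofed server or phishing endpoint without the derived key cannot complete the handshake, which matches the mutual-authentication property the abstract emphasizes.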

Chiyuan Miao - One of the best experts on this subject based on the ideXlab platform.

  • A Comprehensive Evaluation of Various Sensitivity Analysis Methods: A Case Study with a Hydrological Model
    Environmental Modelling and Software, 2014
    Co-Authors: Yanjun Gan, Qingyun Duan, Wei Gong, Charles Tong, Yunwei Sun, Wei Chu, Chiyuan Miao
    Abstract:

    Sensitivity analysis (SA) is a commonly used approach for identifying important Parameters that dominate model behaviors. We use a newly developed software package, a Problem Solving environment for Uncertainty Analysis and Design Exploration (PSUADE), to evaluate the effectiveness and efficiency of ten widely used SA methods, including seven qualitative and three quantitative ones. All SA methods are tested using a variety of sampling techniques to screen out the most sensitive (i.e., important) Parameters from the insensitive ones. The Sacramento Soil Moisture Accounting (SAC-SMA) model, which has thirteen tunable Parameters, is used for illustration. The South Branch Potomac River basin near Springfield, West Virginia, in the U.S. is chosen as the study area. The key findings from this study are: (1) For qualitative SA methods, Correlation Analysis (CA), Regression Analysis (RA), and Gaussian Process (GP) screening methods are shown to be ineffective in this example. Morris One-At-a-Time (MOAT) screening is the most efficient, needing only 280 samples to identify the most important Parameters, but it is the least robust method. Multivariate Adaptive Regression Splines (MARS), Delta Test (DT) and Sum-Of-Trees (SOT) screening methods need about 400-600 samples for the same purpose. Monte Carlo (MC), Orthogonal Array (OA) and Orthogonal Array based Latin Hypercube (OALH) are appropriate sampling techniques for them; (2) For quantitative SA methods, at least 2777 samples are needed for the Fourier Amplitude Sensitivity Test (FAST) to identify Parameter main effects. The McKay method needs about 360 samples to evaluate the main effect and more than 1000 samples to assess the two-way interaction effect. OALH and LPτ (LPTAU) sampling techniques are more appropriate for the McKay method. For the Sobol' method, a minimum of 1050 samples is needed to compute the first-order and total sensitivity indices correctly. These comparisons show that qualitative SA methods are more efficient but less accurate and robust than quantitative ones.
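
    The Morris One-At-a-Time screening mentioned above estimates elementary effects by perturbing one Parameter at a time from each random base point. The sketch below is a simplified radial one-at-a-time variant on a toy four-Parameter function, written with NumPy only; the SAC-SMA model and the PSUADE package used in the paper are not involved, and the toy model, Parameter ranges, step size, and trajectory count are assumptions.

    import numpy as np

    def toy_model(x):
        # Hypothetical response: strong in x0, moderate in x1, interaction in x2*x3.
        return 5.0 * x[0] + 2.0 * x[1] ** 2 + 0.5 * x[2] * x[3]

    def morris_screening(model, n_params, n_trajectories=20, delta=0.25, seed=0):
        rng = np.random.default_rng(seed)
        effects = np.zeros((n_trajectories, n_params))
        for r in range(n_trajectories):
            x = rng.uniform(0.0, 1.0 - delta, size=n_params)  # base point in [0, 1 - delta]
            f_base = model(x)
            for i in range(n_params):
                x_step = x.copy()
                x_step[i] += delta                            # perturb one factor at a time
                effects[r, i] = (model(x_step) - f_base) / delta
        mu_star = np.abs(effects).mean(axis=0)   # mean absolute elementary effect (importance)
        sigma = effects.std(axis=0)              # spread (nonlinearity / interactions)
        return mu_star, sigma

    mu_star, sigma = morris_screening(toy_model, n_params=4)
    for i, (m, s) in enumerate(zip(mu_star, sigma)):
        print(f"x{i}: mu* = {m:.3f}, sigma = {s:.3f}")

    With 20 trajectories over 4 Parameters this uses 20 x (4 + 1) = 100 model runs; the 280-sample figure reported above is consistent with 20 trajectories over SAC-SMA's thirteen Parameters (20 x 14 = 280).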
