Kernel Parameter

The Experts below are selected from a list of 26,331 Experts worldwide, ranked by the ideXlab platform.

Shigeo Abe - One of the best experts on this subject based on the ideXlab platform.

  • ESANN - Optimizing Kernel Parameters by Second-Order Methods
    2007
    Co-Authors: Shigeo Abe
    Abstract:

    Radial basis function (RBF) Kernels are widely used for support vector machines (SVMs). But for model selection of an SVM, we need to optimize the Kernel Parameter and the margin Parameter by time-consuming cross-validation. In this paper we propose determining the Parameters of RBF and Mahalanobis Kernels by maximizing the class separability with second-order optimization. For multi-class problems, we determine the Kernel Parameters for all the two-class problems and assign the average of these values to all the Kernel Parameters. Then we determine the margin Parameter by cross-validation. Computer experiments on multi-class problems show that the proposed method selects optimal or near-optimal Parameters.
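    A minimal sketch of the idea, assuming a kernel-space class-separability measure (between-class center distance over within-class scatter) and using SciPy's quasi-Newton BFGS as a stand-in for the paper's exact second-order procedure; the criterion, optimizer, and toy data below are illustrative assumptions:

    ```python
    # Select the RBF Kernel Parameter gamma by maximizing class separability.
    import numpy as np
    from scipy.optimize import minimize
    from sklearn.metrics.pairwise import rbf_kernel

    def neg_separability(log_gamma, X_a, X_b):
        gamma = np.exp(log_gamma[0])
        K_aa = rbf_kernel(X_a, X_a, gamma=gamma)
        K_bb = rbf_kernel(X_b, X_b, gamma=gamma)
        K_ab = rbf_kernel(X_a, X_b, gamma=gamma)
        # Squared distance between the two class centers in feature space.
        between = K_aa.mean() + K_bb.mean() - 2.0 * K_ab.mean()
        # Within-class scatter; k(x, x) = 1 for the RBF kernel.
        within = (1.0 - K_aa.mean()) + (1.0 - K_bb.mean())
        return -between / (within + 1e-12)           # negate: we maximize

    rng = np.random.default_rng(0)
    X_a = rng.normal(0.0, 1.0, size=(50, 4))         # toy two-class data
    X_b = rng.normal(2.0, 1.0, size=(50, 4))

    res = minimize(neg_separability, x0=[0.0], args=(X_a, X_b), method="BFGS")
    print("selected gamma:", np.exp(res.x[0]))
    ```

    For a multi-class problem, the abstract's recipe would run this search for every two-class pair, assign the average of the resulting values to all the Kernel Parameters, and only then cross-validate the margin Parameter.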

  • ANNPR - Support vector regression using Mahalanobis Kernels
    Artificial Neural Networks in Pattern Recognition, 2006
    Co-Authors: Yuya Kamada, Shigeo Abe
    Abstract:

    In our previous work we showed that Mahalanobis Kernels are useful for support vector classifiers in terms of both generalization ability and model selection speed. In this paper we propose using Mahalanobis Kernels for function approximation. We determine the covariance matrix for the Mahalanobis Kernel using all the training data. Model selection is done by line search: first the margin Parameter and the error threshold are optimized, and then the Kernel Parameter is optimized. In computer experiments on four benchmark problems, the estimation performance of a Mahalanobis Kernel with a diagonal covariance matrix optimized by line search is comparable to or better than that of an RBF Kernel optimized by grid search.
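    A minimal sketch under stated assumptions: the kernel form exp(-delta * d^2 / m), where d is the Mahalanobis distance under a diagonal covariance estimated from all training data, scikit-learn's SVR with a precomputed kernel matrix, and coarse line-search grids; none of these settings are taken from the paper:

    ```python
    # SVR with a diagonal-covariance Mahalanobis Kernel, tuned by line search:
    # first the margin Parameter C and error threshold epsilon, then delta.
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.model_selection import cross_val_score

    def mahalanobis_kernel(X, Y, inv_var, delta):
        # k(x, y) = exp(-delta * (x - y)^T diag(inv_var) (x - y) / m)
        d2 = (((X[:, None, :] - Y[None, :, :]) ** 2) * inv_var).sum(axis=2)
        return np.exp(-delta * d2 / X.shape[1])

    rng = np.random.default_rng(0)
    X = rng.normal(size=(120, 3))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=120)
    inv_var = 1.0 / X.var(axis=0)        # diagonal covariance from all data

    best_score, best_C_eps = -np.inf, None
    K1 = mahalanobis_kernel(X, X, inv_var, delta=1.0)
    for C in [0.1, 1.0, 10.0, 100.0]:
        for eps in [0.01, 0.1, 0.5]:
            s = cross_val_score(SVR(kernel="precomputed", C=C, epsilon=eps),
                                K1, y, cv=5).mean()
            if s > best_score:
                best_score, best_C_eps = s, (C, eps)

    C, eps = best_C_eps
    best_delta = 1.0
    for delta in [0.25, 0.5, 2.0, 4.0]:  # then the Kernel Parameter
        K = mahalanobis_kernel(X, X, inv_var, delta)
        s = cross_val_score(SVR(kernel="precomputed", C=C, epsilon=eps),
                            K, y, cv=5).mean()
        if s > best_score:
            best_score, best_delta = s, delta
    print("line search picked:", C, eps, best_delta)
    ```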

  • ICANN (2) - Training of support vector machines with Mahalanobis Kernels
    Lecture Notes in Computer Science, 2005
    Co-Authors: Shigeo Abe
    Abstract:

    Radial basis function (RBF) Kernels are widely used for support vector machines. But for model selection, we need to optimize the Kernel Parameter and the margin Parameter by time-consuming cross-validation. To solve this problem, in this paper we propose using Mahalanobis Kernels, which are generalized RBF Kernels. We determine the covariance matrix for the Mahalanobis Kernel using the training data of the associated classes. Model selection is done by line search: first the margin Parameter is optimized, and then the Mahalanobis Kernel Parameter. In computer experiments on two-class problems, a Mahalanobis Kernel with a diagonal covariance matrix shows better generalization ability than one with a full covariance matrix, and a Mahalanobis Kernel optimized by line search performs comparably to an RBF Kernel optimized by grid search.
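    A minimal sketch of the two Kernel variants the abstract compares, with the covariance matrix estimated from one class's training data; the 1/m scaling and the single scale Parameter delta are assumptions about the exact kernel form used in the paper:

    ```python
    # Mahalanobis Kernels with full versus diagonal covariance matrices.
    import numpy as np

    def mahalanobis_kernel(X, Y, cov_inv, delta):
        # k(x, y) = exp(-delta * (x - y)^T cov_inv (x - y) / m)
        diff = X[:, None, :] - Y[None, :, :]
        d2 = np.einsum("abi,ij,abj->ab", diff, cov_inv, diff)
        return np.exp(-delta * d2 / X.shape[1])

    rng = np.random.default_rng(0)
    X_class = rng.normal(size=(40, 3))   # training data of the associated class

    cov = np.cov(X_class, rowvar=False)
    K_full = mahalanobis_kernel(X_class, X_class, np.linalg.inv(cov), delta=1.0)
    # Diagonal variant: keep only the per-feature variances.
    K_diag = mahalanobis_kernel(X_class, X_class,
                                np.diag(1.0 / np.diag(cov)), delta=1.0)
    ```

    The diagonal variant estimates only m variances instead of m(m + 1)/2 covariance entries, which is one plausible reason for the better generalization the abstract reports on two-class problems.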

Wang Sheng-chang - One of the best experts on this subject based on the ideXlab platform.

  • Improved RBF-SVM Based on Genetic Algorithm and Its Applications
    Computer Simulation, 2008
    Co-Authors: Wang Sheng-chang
    Abstract:

    The character of the RBF Kernel in the support vector machine is discussed, leading to the conclusion that the generalization ability of a support vector machine can be improved by assigning larger Kernel Parameters to features that are useless for the classification problem, thereby lowering their influence on the Kernel function. On the basis of this conclusion, an improved multi-Kernel-Parameter support vector machine with an RBF Kernel is proposed, in which a genetic algorithm searches for the optimal Kernel Parameters by minimizing the validation error. Experimental results on rolling bearing fault diagnosis show that the improved multi-Kernel-Parameter support vector machine has better generalization ability than the conventional support vector machine, and that its Kernel Parameters directly reflect the classification ability of the corresponding features.
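    A minimal sketch of the multi-Kernel-Parameter idea, assuming one RBF width per feature and a deliberately tiny genetic algorithm that maximizes validation accuracy; the encoding, GA operators, and toy data are all assumptions, and the rolling-bearing data are not reproduced:

    ```python
    # Per-feature RBF widths evolved by a small GA; a large width on a
    # feature suppresses that feature's influence on the kernel.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    def weighted_rbf(X, Y, widths):
        # k(x, y) = exp(-sum_k (x_k - y_k)^2 / widths_k^2)
        d2 = (((X[:, None, :] - Y[None, :, :]) / widths) ** 2).sum(axis=2)
        return np.exp(-d2)

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)   # features 2-4 are useless
    X_tr, X_va, y_tr, y_va = train_test_split(X, y, random_state=0)

    def fitness(log_w):                        # validation accuracy
        w = np.exp(log_w)
        clf = SVC(kernel="precomputed", C=10.0)
        clf.fit(weighted_rbf(X_tr, X_tr, w), y_tr)
        return clf.score(weighted_rbf(X_va, X_tr, w), y_va)

    pop = rng.normal(0.0, 1.0, size=(20, X.shape[1]))    # log-widths
    for gen in range(30):
        scores = np.array([fitness(p) for p in pop])
        parents = pop[np.argsort(scores)[-10:]]          # keep the best half
        a = parents[rng.integers(0, 10, size=20)]
        b = parents[rng.integers(0, 10, size=20)]
        mask = rng.random((20, X.shape[1])) < 0.5        # uniform crossover
        pop = np.where(mask, a, b) + rng.normal(0.0, 0.2, pop.shape)

    best = pop[np.argmax([fitness(p) for p in pop])]
    print("selected widths:", np.exp(best))   # useless features drift large
    ```

    Reading off the evolved widths gives the feature-relevance interpretation the abstract mentions: the larger a feature's width, the less that feature contributes to the classifier.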

Rongho Lin - One of the best experts on this subject based on the ideXlab platform.

  • A novel hybrid genetic algorithm for Kernel function and Parameter optimization in support vector regression
    Expert Systems With Applications, 2009
    Co-Authors: Gwohshiung Tzeng, Rongho Lin
    Abstract:

    This study developed a novel model, HGA-SVR, for optimizing both the Kernel function type and the Kernel Parameter values in support vector regression (SVR), and applied it to forecasting the maximum daily electrical load. A novel hybrid genetic algorithm (HGA) was adapted to search for the optimal Kernel function type and Kernel Parameter values of the SVR so as to increase its accuracy. The proposed model was tested on an electricity load forecasting competition announced on the EUNITE network. The results show that the new HGA-SVR model outperforms the previous models; specifically, it successfully identifies the optimal Kernel function type and all the optimal SVR Parameter values, achieving the lowest prediction errors in electricity load forecasting.
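    A minimal sketch of the HGA-SVR search space, substituting scikit-learn's exhaustive grid search for the paper's hybrid genetic algorithm to keep the example short; the kernel grids and toy data are assumptions, and the EUNITE load data are not reproduced:

    ```python
    # Jointly select the Kernel function type and its Parameters for SVR.
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.model_selection import GridSearchCV

    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(150, 2))
    y = np.sin(X[:, 0]) * np.cos(X[:, 1]) + 0.1 * rng.normal(size=150)

    # One candidate grid per Kernel type; GridSearchCV accepts a list of grids.
    search = GridSearchCV(
        SVR(),
        param_grid=[
            {"kernel": ["rbf"], "gamma": [0.1, 1.0, 10.0], "C": [1, 10, 100]},
            {"kernel": ["poly"], "degree": [2, 3], "C": [1, 10, 100]},
            {"kernel": ["sigmoid"], "gamma": [0.01, 0.1], "C": [1, 10, 100]},
        ],
        cv=5,
    )
    search.fit(X, y)
    print(search.best_params_)   # best Kernel type and Parameter values
    ```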

Yaonan Wang - One of the best experts on this subject based on the ideXlab platform.

  • Texture classification using the support vector machines
    Pattern Recognition, 2003
    Co-Authors: James T Kwok, Hailong Zhu, Yaonan Wang
    Abstract:

    In recent years, support vector machines (SVMs) have demonstrated excellent performance in a variety of pattern recognition problems. In this paper, we apply SVMs to texture classification, using translation-invariant features generated from the discrete wavelet frame transform. To alleviate the problem of selecting the right Kernel Parameter in the SVM, we use a fusion scheme based on multiple SVMs, each with a different setting of the Kernel Parameter. Compared to the traditional Bayes classifier and the learning vector quantization algorithm, SVMs, and in particular the fused output from multiple SVMs, produce more accurate classification results on the Brodatz texture album.
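    A minimal sketch of the fusion scheme, assuming the member SVMs are combined by averaging their decision values; the paper's exact fusion rule and its wavelet-frame texture features are not reproduced here:

    ```python
    # Fuse several SVMs, each trained with a different Kernel Parameter.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 6))
    y = (np.linalg.norm(X[:, :3], axis=1) > 1.7).astype(int)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    gammas = [0.01, 0.1, 1.0, 10.0]      # one SVM per Kernel Parameter setting
    svms = [SVC(kernel="rbf", gamma=g).fit(X_tr, y_tr) for g in gammas]

    # Fusion: average the signed decision values, then threshold at zero.
    fused = np.mean([s.decision_function(X_te) for s in svms], axis=0)
    y_hat = (fused > 0).astype(int)
    print("fused accuracy:", (y_hat == y_te).mean())
    ```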
