Neighbor Classifier


The Experts below are selected from a list of 11,310 Experts worldwide ranked by the ideXlab platform

Jian Yang - One of the best experts on this subject based on the ideXlab platform.

  • k local hyperplane distance nearest Neighbor Classifier oriented local discriminant analysis
    Information Sciences, 2013
    Co-Authors: Jie Xu, Jian Yang
    Abstract:

    The K-local hyperplane distance nearest Neighbor (HKNN) Classifier is an improved K-nearest Neighbor (KNN) algorithm that has been successfully applied to pattern classification. This paper embeds the decision rule of the HKNN Classifier into the discriminant analysis model to develop a new feature extractor. The resulting feature extractor is called K-local hyperplane distance nearest Neighbor Classifier oriented local discriminant analysis (HOLDA), in which a regularization term is imposed on the original HKNN algorithm to obtain a more reliable distance metric. Based on this distance metric, the homo-class and hetero-class local scatters are characterized in HOLDA. By maximizing the ratio of the hetero-class local scatter to the homo-class local scatter, we obtain a subspace that is suitable for feature extraction and classification. In general, this paper provides a framework for building a feature extractor from the decision rule of a Classifier. By this means, the feature extractor and Classifier can be seamlessly integrated. Experimental results on four databases demonstrate that the integrated pattern recognition system is effective.
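
    As a concrete illustration of the decision rule being embedded, below is a minimal sketch of HKNN classification with a regularized local-hyperplane distance, in the spirit of the regularization term mentioned above. The function names and the penalty parameter lam are illustrative assumptions, not taken from the paper.

      import numpy as np

      def hknn_distance(x, neighbors, lam=1.0):
          """Regularized distance from query x to the local affine hyperplane
          spanned by its k nearest same-class Neighbors (rows of `neighbors`)."""
          vbar = neighbors.mean(axis=0)                 # local centroid
          V = (neighbors - vbar).T                      # direction vectors, shape (d, k)
          # alpha minimizes ||x - (vbar + V @ alpha)||^2 + lam * ||alpha||^2
          A = V.T @ V + lam * np.eye(V.shape[1])
          alpha = np.linalg.solve(A, V.T @ (x - vbar))
          return np.linalg.norm(x - (vbar + V @ alpha))

      def hknn_classify(x, X_train, y_train, k=5, lam=1.0):
          """Assign x to the class whose regularized local hyperplane is closest."""
          best_label, best_dist = None, np.inf
          for c in np.unique(y_train):
              Xc = X_train[y_train == c]
              idx = np.argsort(np.linalg.norm(Xc - x, axis=1))[:k]
              d = hknn_distance(x, Xc[idx], lam)
              if d < best_dist:
                  best_label, best_dist = c, d
          return best_label

    With lam = 0 this reduces to the plain local-hyperplane distance; the penalty keeps the combination coefficients small, which is what makes the distance metric more reliable.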

Jie Xu - One of the best experts on this subject based on the ideXlab platform.

  • k local hyperplane distance nearest Neighbor Classifier oriented local discriminant analysis
    Information Sciences, 2013
    Co-Authors: Jie Xu, Jian Yang
    Abstract:

    The K-local hyperplane distance nearest Neighbor (HKNN) Classifier is an improved K-nearest Neighbor (KNN) algorithm that has been successfully applied to pattern classification. This paper embeds the decision rule of the HKNN Classifier into the discriminant analysis model to develop a new feature extractor. The resulting feature extractor is called K-local hyperplane distance nearest Neighbor Classifier oriented local discriminant analysis (HOLDA), in which a regularization term is imposed on the original HKNN algorithm to obtain a more reliable distance metric. Based on this distance metric, the homo-class and hetero-class local scatters are characterized in HOLDA. By maximizing the ratio of the hetero-class local scatter to the homo-class local scatter, we obtain a subspace that is suitable for feature extraction and classification. In general, this paper provides a framework for building a feature extractor from the decision rule of a Classifier. By this means, the feature extractor and Classifier can be seamlessly integrated. Experimental results on four databases demonstrate that the integrated pattern recognition system is effective.
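
    Since this is the same paper listed under both co-authors, a different facet is sketched here: the subspace step. Assuming the homo-class and hetero-class local scatter matrices have already been accumulated as the paper describes, maximizing their ratio is conventionally solved as a generalized eigenproblem; the function below is a hedged sketch of that step, with reg added only for numerical stability.

      import numpy as np
      from scipy.linalg import eigh

      def holda_subspace(S_hetero, S_homo, n_components, reg=1e-6):
          """Return a projection whose columns maximize the ratio of
          hetero-class to homo-class local scatter (generalized eigenproblem)."""
          d = S_homo.shape[0]
          evals, evecs = eigh(S_hetero, S_homo + reg * np.eye(d))
          order = np.argsort(evals)[::-1]               # largest ratios first
          return evecs[:, order[:n_components]]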

Mallikarjun Hangarge - One of the best experts on this subject based on the ideXlab platform.

  • recognition of isolated handwritten kannada numerals based on image fusion method
    Pattern Recognition and Machine Intelligence, 2007
    Co-Authors: G G Rajput, Mallikarjun Hangarge
    Abstract:

    This paper describes a system for isolated Kannada handwritten numeral recognition using an image fusion method. Several digital images corresponding to each handwritten numeral are fused to generate patterns, which are stored in 8×8 matrices irrespective of the size of the images. The numerals to be recognized are matched against each pattern using a nearest Neighbor Classifier, and the best-matching pattern is taken as the recognized numeral. The experimental results show an accuracy of 96.2% for 500 images representing a portion of the training data, with the system trained on 1000 images. A recognition rate of 91% was obtained for 250 test numerals other than the trained images. Further, to test the performance of the proposed scheme, 4-fold cross-validation was carried out, yielding an accuracy of 89%.
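
    The pipeline above can be sketched roughly as follows. The 8×8 block-averaging, the per-class fusion by averaging, and all function names are illustrative assumptions (the paper's exact fusion operator is not reproduced), and input images are assumed to be 2D arrays at least 8×8 in size.

      import numpy as np

      def to_8x8(img):
          """Reduce a 2D numeral image to an 8x8 pattern by block averaging."""
          h, w = img.shape
          ys = (np.arange(9) * h) // 8
          xs = (np.arange(9) * w) // 8
          return np.array([[img[ys[i]:ys[i+1], xs[j]:xs[j+1]].mean()
                            for j in range(8)] for i in range(8)])

      def fuse_patterns(images, labels):
          """Fuse the training images of each numeral class into one 8x8 pattern."""
          return {c: np.mean([to_8x8(im) for im, l in zip(images, labels) if l == c],
                             axis=0)
                  for c in set(labels)}

      def recognize(img, patterns):
          """Nearest-Neighbor match of a query image against the fused patterns."""
          p = to_8x8(img)
          return min(patterns, key=lambda c: np.linalg.norm(p - patterns[c]))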

  • handwritten kannada numeral recognition based on structural features
    Computational Intelligence, 2007
    Co-Authors: B V Dhandra, R G Benne, Mallikarjun Hangarge
    Abstract:

    This paper deals with the automatic recognition of handwritten isolated Kannada numerals based on structural features. Four types of structural features are used for recognition: directional density of pixels in four directions, water reservoirs, maximum profile distances, and fill-hole density. A Minkowski minimum-distance criterion is used to compute distances, and a K-nearest Neighbor Classifier is used to classify the Kannada numerals. A total of 1512 numeral images were tested, and the overall accuracy was found to be 96.12%. The novelty of the proposed method is that it is thinning-free, fast, and independent of writing style.
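
    For reference, the classification stage reduces to a standard K-nearest Neighbor vote under a Minkowski metric. A minimal sketch, assuming the structural features above have already been extracted into a vector x:

      import numpy as np
      from collections import Counter

      def minkowski(a, b, p=2):
          """Minkowski distance; p=1 is city-block, p=2 is Euclidean."""
          return float(np.sum(np.abs(a - b) ** p) ** (1.0 / p))

      def knn_classify(x, X_train, y_train, k=5, p=2):
          """Classify x by majority vote among its k nearest training samples."""
          d = np.array([minkowski(x, xi, p) for xi in X_train])
          nearest = np.argsort(d)[:k]
          return Counter(y_train[i] for i in nearest).most_common(1)[0][0]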

Fan Zhang - One of the best experts on this subject based on the ideXlab platform.

  • collaborative representation based nearest Neighbor Classifier for hyperspectral imagery
    IEEE Geoscience and Remote Sensing Letters, 2015
    Co-Authors: Fan Zhang
    Abstract:

    Novel collaborative representation (CR)-based nearest Neighbor (NN) algorithms are proposed for hyperspectral image classification. The proposed methods are based on a CR computed by an l2-norm minimization with a Tikhonov regularization matrix. More specifically, a test sample is represented as a linear combination of all the training samples, and the representation weights are estimated by a closed-form solution derived from the l2-norm minimization. In the first strategy, the label of a test sample is determined by majority voting among the training samples with the k largest representation weights. In the second strategy, local within-class CR is considered as an alternative, and the test sample is assigned to the class producing the minimum representation residual. The experimental results show that the proposed algorithms achieve better performance than several previous algorithms, such as the original k-NN Classifier and the local mean-based NN Classifier.
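
    A minimal sketch of the first strategy follows. The closed form assumes the usual Tikhonov-regularized least squares; the choice of a diagonal Gamma whose entries are the distances from the test sample to each training sample is a common one in the CR literature and is assumed here rather than taken from the paper.

      import numpy as np

      def cr_weights(X, y, lam=0.1):
          """Representation weights for test sample y over training matrix X
          (columns are training samples), via l2-minimization with a Tikhonov
          matrix: w = (X^T X + lam * Gamma^T Gamma)^{-1} X^T y."""
          gamma = np.linalg.norm(X - y[:, None], axis=0)   # per-sample bias (assumed)
          A = X.T @ X + lam * np.diag(gamma ** 2)
          return np.linalg.solve(A, X.T @ y)

      def crnn_predict(X, labels, y, k=5, lam=0.1):
          """Strategy 1: majority vote over the k largest representation weights."""
          w = cr_weights(X, y, lam)
          top = np.argsort(w)[::-1][:k]
          vals, counts = np.unique(labels[top], return_counts=True)
          return vals[np.argmax(counts)]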

Jianping Gou - One of the best experts on this subject based on the ideXlab platform.

  • a generalized mean distance based k nearest Neighbor Classifier
    Expert Systems With Applications, 2019
    Co-Authors: Jianping Gou, Shaoning Zeng, Yunbo Rao, Hebiao Yang
    Abstract:

    The K-nearest Neighbor (KNN) rule is a well-known non-parametric Classifier that is widely used in pattern recognition. However, sensitivity to the Neighborhood size k often seriously degrades KNN-based classification performance, especially for small sample sizes with outliers. To overcome this issue, in this article we propose a generalized mean distance-based k-nearest Neighbor Classifier (GMDKNN) by introducing multiple generalized mean distances and a nested generalized mean distance, both based on the properties of the generalized mean. In the proposed method, multiple local mean vectors of the given query sample are calculated in each class using its class-specific k nearest Neighbors. From the k local mean vectors per class, the corresponding k generalized mean distances are calculated and then used to form the categorical nested generalized mean distance. In the classification phase, the categorical nested generalized mean distance serves as the decision rule, and the query sample is assigned to the class with the minimum nested generalized mean distance among all classes. Extensive experiments on the UCI and KEEL data sets, synthetic data sets, the KEEL noise data sets, and the UCR time series data sets compare the proposed method with state-of-the-art KNN-based methods. The experimental results demonstrate that the proposed GMDKNN performs better and is less sensitive to k. Thus, with its robust and effective classification performance, the proposed GMDKNN could be a promising method for pattern recognition in expert and intelligent systems.
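
    A simplified sketch of the rule is given below; it builds the k local mean vectors per class and scores each class with a single generalized (power) mean of the query-to-local-mean distances. The nesting of generalized means described in the paper is deliberately omitted, and the default p = -1 (harmonic mean) is an illustrative assumption.

      import numpy as np

      def generalized_mean(values, p=-1.0):
          """Generalized (power) mean; p = -1 gives the harmonic mean."""
          v = np.asarray(values, dtype=float)
          return float(np.mean(v ** p) ** (1.0 / p))

      def gmdknn_classify(x, X_train, y_train, k=5, p=-1.0):
          """Simplified GMDKNN: smallest per-class generalized mean distance wins."""
          best_label, best_score = None, np.inf
          for c in np.unique(y_train):
              Xc = X_train[y_train == c]
              idx = np.argsort(np.linalg.norm(Xc - x, axis=1))[:k]
              nn = Xc[idx]
              # i-th local mean vector = mean of the i nearest Neighbors
              local_means = np.cumsum(nn, axis=0) / np.arange(1, len(nn) + 1)[:, None]
              dists = np.linalg.norm(local_means - x, axis=1) + 1e-12
              score = generalized_mean(dists, p)
              if score < best_score:
                  best_label, best_score = c, score
          return best_label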

  • a local mean based k nearest centroid Neighbor Classifier
    The Computer Journal, 2012
    Co-Authors: Jianping Gou, Taisong Xiong
    Abstract:

    The K-nearest Neighbor (KNN) rule is a simple and effective algorithm for pattern classification. In this article, we propose a local mean-based k-nearest centroid Neighbor Classifier that assigns each query pattern the class label of the nearest local centroid mean vector, so as to improve classification performance. The proposed scheme not only takes into account the proximity and spatial distribution of the k Neighbors, but also utilizes the local mean vector of the k Neighbors from each class in making the classification decision. In the proposed Classifier, a local mean vector of the k nearest centroid Neighbors from each class for a query pattern is well positioned to capture the class distribution information. To investigate the classification behavior of the proposed Classifier, we conduct extensive experiments on real and synthetic data sets in terms of classification error. Experimental results demonstrate that our proposed method performs significantly well, particularly for small sample sizes, compared with state-of-the-art KNN-based algorithms.
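
    To make the "centroid Neighbor" idea concrete, here is a hedged sketch: centroid Neighbors are chosen greedily so that the centroid of the selected set stays as close to the query as possible, and the query is assigned to the class whose local mean of centroid Neighbors is nearest. Function names are illustrative, not the paper's.

      import numpy as np

      def k_nearest_centroid_neighbors(x, X, k):
          """Greedily pick k rows of X so the running centroid stays closest to x."""
          remaining = list(range(len(X)))
          chosen = []
          for _ in range(k):
              best_i, best_d = None, np.inf
              for i in remaining:
                  d = np.linalg.norm(X[chosen + [i]].mean(axis=0) - x)
                  if d < best_d:
                      best_i, best_d = i, d
              chosen.append(best_i)
              remaining.remove(best_i)
          return chosen

      def lmkncn_classify(x, X_train, y_train, k=5):
          """Assign x to the class whose local mean of k nearest centroid
          Neighbors is closest to the query (sketch of the proposed rule)."""
          best_label, best_d = None, np.inf
          for c in np.unique(y_train):
              Xc = X_train[y_train == c]
              idx = k_nearest_centroid_neighbors(x, Xc, min(k, len(Xc)))
              d = np.linalg.norm(Xc[idx].mean(axis=0) - x)
              if d < best_d:
                  best_label, best_d = c, d
          return best_label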