Negative Correlation

14,000,000 Leading Edge Experts on the ideXlab platform

Scan Science and Technology

Contact Leading Edge Experts & Companies

The Experts below are selected from a list of 334,374 Experts worldwide ranked by the ideXlab platform

Huanhuan Chen - One of the best experts on this subject based on the ideXlab platform.

  • Semisupervised Negative Correlation Learning
    IEEE Transactions on Neural Networks and Learning Systems, 2018
    Co-Authors: Huanhuan Chen, Bingbing Jiang
    Abstract:

    Negative correlation learning (NCL) is an ensemble learning algorithm that introduces a correlation penalty term into the cost function of each individual ensemble member, so that each member minimizes its mean square error together with its error correlation with the rest of the ensemble. This paper analyzes NCL and reveals that adopting a negative correlation term for unlabeled data is beneficial to model performance in the semisupervised learning (SSL) setting. We then propose a novel SSL algorithm, semisupervised NCL (SemiNCL), which considers the negative correlation terms for both labeled and unlabeled data. To reduce the computational and memory complexity, an accelerated SemiNCL is derived from the distributed least squares algorithm. In addition, we derive a bound for two parameters in SemiNCL based on an analysis of the Hessian matrix of the error function. The new algorithm is evaluated by extensive experiments with various ratios of labeled and unlabeled training data. Comparisons with other state-of-the-art supervised and semisupervised algorithms confirm that SemiNCL achieves the best overall performance.
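The correlation penalty that SemiNCL builds on can be sketched for the plain supervised case as follows. This is a minimal gradient-descent illustration with linear ensemble members; the member count, penalty strength lambda, and the linear model are illustrative choices, not the paper's setup:

```python
import numpy as np

def ncl_train(X, y, n_members=5, lam=0.5, lr=0.01, epochs=300, seed=0):
    """Negative correlation learning for an ensemble of linear regressors.

    Each member i minimizes (f_i - y)^2 + lam * p_i, where the penalty
    p_i = (f_i - f_bar) * sum_{j != i} (f_j - f_bar) = -(f_i - f_bar)^2
    pushes each member's error to be negatively correlated with the rest.
    X is (n, d); y is (n, 1).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.normal(scale=0.1, size=(n_members, d))  # one weight row per member
    for _ in range(epochs):
        F = X @ W.T                          # (n, n_members) member outputs
        f_bar = F.mean(axis=1, keepdims=True)
        for i in range(n_members):
            f_i = F[:, [i]]
            # d e_i / d f_i = 2 (f_i - y) - 2 lam (f_i - f_bar), treating
            # f_bar as constant w.r.t. f_i, as in standard NCL
            grad_out = 2 * (f_i - y) - 2 * lam * (f_i - f_bar)
            W[i] -= lr * (grad_out * X).mean(axis=0)
    return W

def ncl_predict(W, X):
    return (X @ W.T).mean(axis=1, keepdims=True)  # ensemble = simple average
```

With `lam = 0` the members train independently; as `lam` grows toward 1, the penalty increasingly rewards disagreement around the ensemble mean while the averaged prediction still tracks the target.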

  • Ensemble Learning by Negative Correlation Learning
    Ensemble Machine Learning, 2012
    Co-Authors: Huanhuan Chen, Anthony G. Cohn
    Abstract:

    This chapter investigates a specific ensemble learning approach, negative correlation learning (NCL) [21, 22, 23]. NCL is an ensemble learning algorithm that considers the cooperation and interaction among ensemble members. It introduces a correlation penalty term into the cost function of each individual learner so that each learner minimizes its mean-square error (MSE) together with its correlation with the other ensemble members.
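The penalty just described is commonly written as follows (a standard formulation of NCL; lambda controls the penalty strength and f-bar denotes the ensemble average):

```latex
e_i = \frac{1}{2}\big(f_i(x) - y\big)^2 + \lambda \, p_i, \qquad
p_i = \big(f_i(x) - \bar{f}(x)\big) \sum_{j \neq i} \big(f_j(x) - \bar{f}(x)\big)
    = -\big(f_i(x) - \bar{f}(x)\big)^2
```

The last equality uses the fact that the deviations from the ensemble mean sum to zero, so the penalty simply rewards each member for moving away from the ensemble average.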

  • Negative Correlation learning for classification ensembles
    The 2010 International Joint Conference on Neural Networks (IJCNN), 2010
    Co-Authors: Shuo Wang, Huanhuan Chen
    Abstract:

    This paper proposes a new negative correlation learning (NCL) algorithm, called AdaBoost.NC, which uses a theoretically derived ambiguity term for classification ensembles to introduce diversity explicitly. All existing NCL algorithms, such as CELS and NCCD, and their theoretical backgrounds were studied in the regression context; we focus on classification problems in this paper. First, we study the ambiguity decomposition with the 0-1 error function, which differs from the one proposed by Krogh et al. and is applicable to both binary-class and multi-class problems. Then, to overcome the identified drawbacks of the existing algorithms, AdaBoost.NC is proposed, exploiting the ambiguity term in the decomposition to improve diversity. Comprehensive experiments are performed on a collection of benchmark data sets. The results show that AdaBoost.NC is a promising algorithm for classification problems, giving better performance than standard AdaBoost and NCCD while consuming much less computation time than CELS.
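The 0-1 ambiguity idea can be illustrated with a small disagreement-based diversity measure for a voting ensemble. This is a hedged sketch, not the exact decomposition or the AdaBoost.NC penalty from the paper:

```python
import numpy as np

def majority_vote(member_preds):
    """Majority vote over an (n_members, n_samples) array of class labels.

    Ties resolve toward the smaller label (argmax picks the first maximum).
    """
    votes = np.apply_along_axis(
        np.bincount, 0, member_preds, minlength=member_preds.max() + 1)
    return votes.argmax(axis=0)

def zero_one_ambiguity(member_preds, ensemble_pred):
    """Average 0-1 disagreement between each member and the ensemble vote.

    Larger values mean a more diverse ensemble; an NCL-style algorithm for
    classification can reward this quantity to encourage diversity.
    """
    return float((member_preds != ensemble_pred[None, :]).mean())
```

For example, three members voting over three samples yield an ambiguity equal to the fraction of member votes that differ from the majority decision.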

  • Multiobjective Neural Network Ensembles Based on Regularized Negative Correlation Learning
    IEEE Transactions on Knowledge and Data Engineering, 2010
    Co-Authors: Huanhuan Chen
    Abstract:

    Negative correlation learning (NCL) is a neural network ensemble learning algorithm that introduces a correlation penalty term into the cost function of each individual network so that each network minimizes its mean-square error (MSE) together with its correlation with the rest of the ensemble. This paper describes NCL in detail and observes that NCL corresponds to training the entire ensemble as a single learning machine that only minimizes the MSE without regularization. This insight explains why NCL is prone to overfitting the noise in the training set. The paper analyzes this problem and proposes the multiobjective regularized negative correlation learning (MRNCL) algorithm, which incorporates an additional regularization term for the ensemble and uses an evolutionary multiobjective algorithm to design ensembles. In MRNCL, we define the crossover and mutation operators and adopt a nondominated sorting algorithm with fitness sharing and rank-based fitness assignment. Experiments on synthetic as well as real-world data sets demonstrate that MRNCL achieves better performance than NCL, especially when the noise level in the data set is nontrivial. In the experimental discussion, we give three reasons why our algorithm outperforms others.
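The observation that NCL amounts to training the whole ensemble as one unregularized learner follows from a short identity: when each of the M members' costs is its squared error minus its squared deviation from the ensemble mean (the NCL cost at full penalty strength), the member costs sum to the plain ensemble MSE, with no term controlling model complexity; this is the overfitting risk that MRNCL's added regularization term addresses:

```latex
\sum_{i=1}^{M} e_i
= \sum_{i=1}^{M} \Big[ \big(f_i - y\big)^2 - \big(f_i - \bar{f}\big)^2 \Big]
= M \, \big(\bar{f} - y\big)^2
```

The identity follows by expanding each squared error around the ensemble mean: the cross terms sum to zero and the member-deviation terms cancel against the penalty.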

Chien-chih Chen - One of the best experts on this subject based on the ideXlab platform.

  • Negative Correlation between frequency-magnitude power-law exponent and Hurst coefficient in the long-range connective sandpile model for earthquakes and for real seismicity
    EPL, 2012
    Co-Authors: Luciano Telesca, Chien-chih Chen
    Abstract:

    The long-range connective sandpile (LRCS) model was applied to Italian seismicity. The Hurst exponent H and the power-law slope b of the frequency-size distributions are investigated for avalanches in the LRCS model and for earthquakes in Italy. The study shows how the correlation coefficient between the b and H values changes with the length of the calculation window, with similar behavior in the LRCS model and the Italian catalogue. The negative correlation between b and H can be seen clearly when an appropriate window length is employed for the various time series. We suggest that this negative relationship is caused by the growing correlation length as the system accumulates enough energy, and that the calculation window length is an important index of the intensity of the negative correlation between the two exponents. The appropriate window length can be related to the characteristic period of the avalanches in the various sandpile and seismicity time series.
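The two exponents compared above can be estimated along these lines: the b value via Aki's maximum-likelihood formula (one standard estimator for the Gutenberg-Richter slope) and the Hurst exponent via a crude rescaled-range (R/S) fit. The segment sizes and the simplified R/S variant are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

def b_value(mags, m_min):
    """Aki's maximum-likelihood estimate of the Gutenberg-Richter b value."""
    return np.log10(np.e) / (np.mean(mags) - m_min)

def hurst_rs(x):
    """Crude rescaled-range (R/S) estimate of the Hurst exponent H.

    Fits log(R/S) against log(segment size) over a few dyadic sizes;
    H ~ 0.5 for uncorrelated noise, H > 0.5 for persistent series.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    rs, ns = [], []
    for s in (n // 8, n // 4, n // 2, n):
        vals = []
        for start in range(0, n - s + 1, s):
            seg = x[start:start + s]
            dev = np.cumsum(seg - seg.mean())   # cumulative deviation profile
            r = dev.max() - dev.min()           # range of the profile
            sd = seg.std()
            if sd > 0:
                vals.append(r / sd)
        if vals:
            rs.append(np.mean(vals))
            ns.append(s)
    slope, _ = np.polyfit(np.log(ns), np.log(rs), 1)
    return slope
```

Sliding either estimator over a moving window of the event series gives the windowed b and H sequences whose correlation the paper studies.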

  • Negative Correlation between power-law scaling and Hurst exponents in long-range connective sandpile models and real seismicity
    Chaos Solitons & Fractals, 2012
    Co-Authors: Chien-chih Chen
    Abstract:

    We propose a generic negative correlation between power-law scaling and Hurst exponents for size/magnitude data from real and synthetic earthquakes. The synthetic earthquakes were produced by a conceptual earthquake model, the long-range connective sandpile (LRCS) model, a recent modification of sandpile models that considers random distant connections between two separated cells instead of only neighboring cells. We calculated the Hurst exponent H and the power-law scaling exponent B for event size data in the LRCS model, systematically explored the relationship between the two exponents, and conclusively obtained a negative correlation between H and B. We also found this negative correlation for real earthquake data registered in the Taiwan Central Weather Bureau (CWB) catalog. Such a negative correlation has frequently been suggested, but had not previously been demonstrated for real seismicity.
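A minimal check of the reported H-B relationship is the Pearson correlation between the windowed exponent series; a negative coefficient corresponds to the anticorrelation described above. The input series here are hypothetical placeholders:

```python
import numpy as np

def exponent_correlation(h_series, b_series):
    """Pearson correlation between windowed Hurst (H) and power-law (B) exponents.

    A negative value is consistent with the H-B anticorrelation reported
    for the LRCS model and the CWB catalog.
    """
    h = np.asarray(h_series, dtype=float)
    b = np.asarray(b_series, dtype=float)
    return float(np.corrcoef(h, b)[0, 1])
```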

Qiangfu Zhao - One of the best experts on this subject based on the ideXlab platform.

  • From low Negative Correlation learning to high Negative Correlation learning
    2014 International Joint Conference on Neural Networks (IJCNN), 2014
    Co-Authors: Qiangfu Zhao
    Abstract:

    Besides the previously studied transition learning between two different ensemble learning algorithms, such as negative correlation learning and balanced ensemble learning, transition learning can also be implemented within negative correlation learning under different correlation penalties. On the one hand, negative correlation learning with a lower correlation penalty, termed low negative correlation learning, may fit the training data too closely while generating less negatively correlated neural networks. On the other hand, negative correlation learning with a higher correlation penalty, termed high negative correlation learning, may be unable to fit the training data well, but is capable of generating highly negatively correlated neural networks. By conducting transition learning from low negative correlation learning to high negative correlation learning, this paper shows that ensembles can achieve both good performance and diverse individual neural networks.
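The low-to-high transition can be realized by ramping the correlation-penalty strength during training. This is an illustrative linear schedule; the endpoint values and the ramp shape are assumptions rather than the paper's settings:

```python
def lambda_schedule(epoch, total_epochs, lam_low=0.2, lam_high=0.9):
    """Ramp the NCL correlation-penalty strength from low to high.

    Early epochs use a small penalty (close fit, little diversity); late
    epochs use a large penalty (more negatively correlated members).
    """
    # fraction of training completed, clipped to [0, 1]
    t = min(max(epoch / max(total_epochs - 1, 1), 0.0), 1.0)
    return lam_low + t * (lam_high - lam_low)
```

Each epoch's returned value would be passed as the penalty coefficient to the NCL weight update.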

  • Control of Correlation in Negative Correlation learning
    2014 10th International Conference on Natural Computation (ICNC), 2014
    Co-Authors: Qiangfu Zhao
    Abstract:

    Balanced ensemble learning is developed from negative correlation learning by shifting the learning targets. Compared with negative correlation learning, balanced ensemble learning learns faster and achieves higher accuracy on the training sets for a number of the tested classification problems. However, it has been found that the higher the accuracy balanced ensemble learning obtains on the training set, the higher the risk that it will be trapped in overfitting. To lessen the degree of overfitting in balanced ensemble learning, two parameters, a lower bound of error rate (LBER) and an upper bound of error output (UBEO), are set to decide whether a training point should be learned or ignored during the learning process. Such selective learning prevents the ensemble from learning the training set so thoroughly that its performance on the testing set suffers. This paper shows how LBER and UBEO affect the performance of balanced ensemble learning from the viewpoint of correlation control.
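One plausible reading of the LBER/UBEO gate is sketched below. The abstract does not spell out the exact rule, so the threshold logic and the default values here are assumptions for illustration only:

```python
def should_learn(ensemble_error_rate, point_error_output,
                 lber=0.05, ubeo=0.8):
    """Selective-learning gate sketch using LBER and UBEO thresholds.

    Assumed rule: skip a training point when the ensemble already fits the
    training set well enough (error rate below LBER), or when the point's
    error output is so large that it is likely noise (above UBEO).
    """
    if ensemble_error_rate < lber:   # training set essentially learned
        return False
    if point_error_output > ubeo:    # probable outlier / noisy label
        return False
    return True
```

Under this reading, tightening LBER stops training earlier, and lowering UBEO discards more suspect points, both of which limit overfitting.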

  • SMC - Transition Learning between Balanced Ensemble Learning and Negative Correlation Learning
    2013 IEEE International Conference on Systems Man and Cybernetics, 2013
    Co-Authors: Qiangfu Zhao
    Abstract:

    In this paper, transition learning between balanced ensemble learning and negative correlation learning is introduced. The idea of transition learning is to apply balanced ensemble learning for a certain time and then switch to negative correlation learning; the short learning period with sudden changes in learning behavior is called the transition. Experimental studies were conducted to examine the learning behavior during the transition process. It was found that the training error rates changed abruptly at the beginning of the transition and that these changes became smaller and smaller toward its end. The more interesting question is how such sudden changes on the training set carry over to the testing set. By observing the testing error rates, it was found that transition learning was able to prevent the learning process from overfitting.

Luciano Telesca - One of the best experts on this subject based on the ideXlab platform.

  • Negative Correlation between frequency-magnitude power-law exponent and Hurst coefficient in the long-range connective sandpile model for earthquakes and for real seismicity
    EPL, 2012
    Co-Authors: Luciano Telesca, Chien-chih Chen
    Abstract:

    The long-range connective sandpile (LRCS) model was applied to Italian seismicity. The Hurst exponent H and the power-law slope b of the frequency-size distributions are investigated for avalanches in the LRCS model and for earthquakes in Italy. The study shows how the correlation coefficient between the b and H values changes with the length of the calculation window, with similar behavior in the LRCS model and the Italian catalogue. The negative correlation between b and H can be seen clearly when an appropriate window length is employed for the various time series. We suggest that this negative relationship is caused by the growing correlation length as the system accumulates enough energy, and that the calculation window length is an important index of the intensity of the negative correlation between the two exponents. The appropriate window length can be related to the characteristic period of the avalanches in the various sandpile and seismicity time series.

Mei-ling Shyu - One of the best experts on this subject based on the ideXlab platform.

  • ICSC - Negative Correlation Discovery for Big Multimedia Data Semantic Concept Mining and Retrieval
    2016 IEEE Tenth International Conference on Semantic Computing (ICSC), 2016
    Co-Authors: Mei-ling Shyu
    Abstract:

    With massive amounts of data being produced every day in almost every field, traditional data processing techniques have become increasingly inadequate, yet research on effectively managing and retrieving such big data is still under development. Multimedia high-level semantic concept mining and retrieval in big data is one of the most challenging research topics, requiring joint efforts from researchers in both the big data mining and multimedia domains. To bridge the semantic gap between high-level concepts and low-level visual features, correlation discovery in semantic concept mining is worth exploring. Meanwhile, correlation discovery is a computationally intensive task in the sense that it requires deep analysis of very large and growing repositories. This paper presents a novel system for discovering negative correlations for semantic concept mining and retrieval. It is designed for the Hadoop MapReduce framework and is further extended to utilize Spark, a more efficient and general cluster computing engine. The experimental results demonstrate the feasibility of utilizing big data technologies in negative correlation discovery.
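The counting-then-aggregation structure that makes negative correlation discovery MapReduce-friendly can be sketched in a single process. The lift statistic and the transaction representation below are illustrative assumptions, not necessarily the paper's exact correlation measure:

```python
from collections import Counter
from itertools import combinations

def negative_correlations(transactions, threshold=0.5):
    """Flag concept pairs that co-occur less often than independence predicts.

    transactions: list of sets of concept labels (one set per image/shot).
    Returns (a, b, lift) triples where lift = P(a, b) / (P(a) * P(b)) falls
    below `threshold`. The counting pass mirrors the map phase the paper
    distributes over MapReduce/Spark; the aggregation mirrors the reduce phase.
    """
    n = len(transactions)
    single = Counter()
    pair = Counter()
    for t in transactions:                      # "map": emit per-item counts
        single.update(t)
        pair.update(combinations(sorted(t), 2))
    flagged = []                                # "reduce": aggregate and test
    for a, b in combinations(sorted(single), 2):
        lift = (pair[(a, b)] / n) / ((single[a] / n) * (single[b] / n))
        if lift < threshold:
            flagged.append((a, b, lift))
    return flagged
```

In a real deployment the per-transaction counting would run in parallel across mappers (or Spark partitions), with the lift computation performed after the counts are reduced.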
