Adaboost

14,000,000 Leading Edge Experts on the ideXlab platform

Scan Science and Technology

Contact Leading Edge Experts & Companies

The Experts below are selected from a list of 26784 Experts worldwide ranked by ideXlab platform

Holger Schwenk - One of the best experts on this subject based on the ideXlab platform.

  • using boosting to improve a hybrid hmm neural network speech recognizer
    International Conference on Acoustics Speech and Signal Processing, 1999
    Co-Authors: Holger Schwenk
    Abstract:

    "Boosting" is a general method for improving the performance of almost any learning algorithm. A previously proposed and very promising boosting algorithm is Adaboost. In this paper we investigate whether Adaboost can be used to improve a hybrid HMM/neural network continuous speech recognizer. Boosting significantly improves the word error rate from 6.3% to 5.3% on a test set of the OGI Numbers 95 corpus, a medium-size continuous numbers recognition task. These results compare favorably with other combining techniques that use several different feature representations or additional information from longer time spans. The reasons for the impressive success of Adaboost are still not completely understood, and to the best of our knowledge no application of Adaboost to a real-world problem has yet been reported in the literature. In this domain we have to deal with large amounts of data (often more than 1 million training examples) and inherently noisy phoneme labels. The paper is organized as follows: we summarize the Adaboost algorithm and our baseline speech recognizer, show how Adaboost can be applied to this task, report results on the Numbers 95 corpus, and compare them with other classifier combination techniques. The paper finishes with a conclusion and perspectives for future work.
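For reference, the discrete Adaboost procedure the paper builds on can be sketched as follows. This is a generic implementation with decision stumps as weak learners (my own illustration, not the authors' HMM/neural-network system):

```python
import numpy as np

def stump_learner(X, y, w):
    """Weighted decision stump: threshold one feature, with polarity s."""
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (1, -1):
                pred = s * np.where(X[:, j] >= t, 1, -1)
                err = w[pred != y].sum()
                if best is None or err < best[0]:
                    best = (err, j, t, s)
    _, j, t, s = best
    return lambda Z: s * np.where(Z[:, j] >= t, 1, -1)

def adaboost_train(X, y, weak_learner, rounds=20):
    """Discrete AdaBoost (Freund & Schapire); labels y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)                    # start from uniform example weights
    ensemble = []                              # list of (alpha, hypothesis) pairs
    for _ in range(rounds):
        h = weak_learner(X, y, w)              # fit on the current weighting
        pred = h(X)
        err = max(w[pred != y].sum(), 1e-10)   # weighted error, clipped away from 0
        if err >= 0.5:                         # no better than chance: stop
            break
        alpha = 0.5 * np.log((1 - err) / err)  # hypothesis weight
        ensemble.append((alpha, h))
        w = w * np.exp(-alpha * y * pred)      # up-weight the mistakes
        w = w / w.sum()
        if err <= 1e-10:                       # perfect weak learner: done
            break
    return ensemble

def adaboost_predict(ensemble, X):
    return np.sign(sum(a * h(X) for a, h in ensemble))
```

Applying this to frame-level phoneme posteriors, as the paper does, additionally requires handling the noisy labels and the sheer number of training frames it mentions.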

Wang Yuan-qing - One of the best experts on this subject based on the ideXlab platform.

  • Research of the Real Adaboost Algorithm
    Computer Science, 2010
    Co-Authors: Wang Yuan-qing
    Abstract:

    In current artificial intelligence and pattern recognition work, the Real Adaboost algorithm has been used increasingly widely because of its high accuracy and very fast speed. We therefore studied the theoretical basis of the Real Adaboost algorithm in depth and analyzed in detail the procedure for training classifiers based on it. In the course of this work, we examined the relationships between the mathematical variables involved in the algorithm, derived the mathematical process of the algorithm quantitatively, and analyzed qualitatively the causes of problems that appear during training. Finally, we offer several suggestions for improving the Real Adaboost algorithm.
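One round of Real Adaboost with a simple binned weak hypothesis on a scalar feature can be sketched as follows. This is an illustrative instance of the confidence-rated formulation (Schapire and Singer), with the bin count and smoothing constant chosen arbitrarily, not the analysis in the paper:

```python
import numpy as np

def real_adaboost_round(x, y, w, bins=8, eps=1e-6):
    """One Real AdaBoost round on a scalar feature.
    The weak hypothesis outputs a real-valued confidence per bin:
        h(bin) = 0.5 * ln(W_plus / W_minus)
    where W_plus / W_minus are the weighted masses of positive / negative
    examples (y in {-1, +1}) falling in that bin; eps smooths empty bins."""
    edges = np.linspace(x.min(), x.max(), bins + 1)
    idx = np.clip(np.digitize(x, edges[1:-1]), 0, bins - 1)  # bin index per example
    h = np.zeros(bins)
    for b in range(bins):
        wp = w[(idx == b) & (y == 1)].sum()
        wn = w[(idx == b) & (y == -1)].sum()
        h[b] = 0.5 * np.log((wp + eps) / (wn + eps))
    out = h[idx]                          # confidence-rated output per example
    w_new = w * np.exp(-y * out)          # multiplicative weight update
    return edges, h, w_new / w_new.sum()
```

The quantities wp and wn here are the variables whose interplay (e.g., near-empty bins driving |h| to its smoothed cap) produces the training-procedure problems such an analysis examines.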

Xuezhi Wen - One of the best experts on this subject based on the ideXlab platform.

  • a rapid learning algorithm for vehicle classification
    Information Sciences, 2015
    Co-Authors: Xuezhi Wen, Ling Shao, Yu Xue, Wei Fang
    Abstract:

    Highlights: a fast learning algorithm is introduced for real-time vehicle classification; a fast feature-selection method for Adaboost is presented that combines a sample's feature value with its class label; a rapid incremental learning algorithm for Adaboost is designed.

    Adaboost is a popular method for vehicle detection, but its training process is quite time-consuming. In this paper, a rapid learning algorithm is proposed to tackle this weakness of Adaboost for vehicle classification. Firstly, an algorithm for computing the Haar-like feature pool on a 32×32 grayscale image patch using all simple and rotated Haar-like prototypes is introduced to represent a vehicle's appearance. Then, a fast training approach for the weak classifier is presented that combines a sample's feature value with its class label. Finally, a rapid incremental learning algorithm for Adaboost is designed to significantly improve the performance of Adaboost. Experimental results demonstrate that the proposed approaches not only speed up the training and incremental learning processes of Adaboost, but also yield better or competitive vehicle classification accuracies compared with several state-of-the-art methods, showing their potential for real-time applications.
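Haar-like feature pools of this kind rest on the integral-image trick, which makes any rectangle sum an O(1) operation. A minimal sketch, assuming an upright two-rectangle prototype (the rotated prototypes the paper also uses need a separate 45°-tilted integral image, omitted here):

```python
import numpy as np

def integral_image(img):
    """Padded summed-area table: ii[r, c] = sum of img[:r, :c]."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1))
    ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
    return ii

def rect_sum(ii, r, c, h, w):
    """Pixel sum of the h x w rectangle with top-left corner (r, c),
    via four integral-image lookups."""
    return ii[r + h, c + w] - ii[r, c + w] - ii[r + h, c] + ii[r, c]

def haar_two_rect_horizontal(ii, r, c, h, w):
    """Upright two-rectangle Haar-like feature: left half minus right half."""
    left = rect_sum(ii, r, c, h, w // 2)
    right = rect_sum(ii, r, c + w // 2, h, w - w // 2)
    return left - right
```

Enumerating every prototype at every position and scale inside a 32×32 patch is what produces the large feature pool from which Adaboost then selects.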

  • An improved algorithm based on Adaboost for vehicle recognition
    The 2nd International Conference on Information Science and Engineering, 2010
    Co-Authors: Xuezhi Wen, Yuhui Zheng
    Abstract:

    An improved algorithm based on Adaboost is proposed to address the long training time of the Adaboost classifier, as well as the performance and storage-space problems of SVM (support vector machine) and NN (neural network) classifiers, in vehicle recognition. Experimental results demonstrate that the proposed approach performs better than the traditional methods, consumes less training time than the traditional Adaboost algorithm, and shows a promising perspective.

Cesare Furlanello - One of the best experts on this subject based on the ideXlab platform.

  • Parallelizing Adaboost by weights dynamics
    Computational Statistics & Data Analysis, 2007
    Co-Authors: Stefano Merler, Bruno Caprile, Cesare Furlanello
    Abstract:

    Adaboost is one of the most popular classification methods. In contrast to other ensemble methods (e.g., Bagging), Adaboost is inherently sequential, which may limit its practical applicability in many data-intensive real-world situations. P-Adaboost is a novel scheme for the parallelization of Adaboost that builds upon earlier results concerning the dynamics of Adaboost weights. P-Adaboost yields approximations to the standard Adaboost models that can be easily and efficiently distributed over a network of computing nodes. Properties of P-Adaboost as a stochastic minimizer of the Adaboost cost functional are discussed, and experiments are reported on both synthetic and benchmark data sets.

Jufu Feng - One of the best experts on this subject based on the ideXlab platform.

  • a refined margin analysis for boosting algorithms via equilibrium margin
    Journal of Machine Learning Research, 2011
    Co-Authors: Liwei Wang, Masashi Sugiyama, Cheng Yang, Zhihua Zhou, Zhaoxiang Jing, Jufu Feng
    Abstract:

    Much attention has been paid to the theoretical explanation of the empirical success of Adaboost. The most influential work is the margin theory, which is essentially an upper bound for the generalization error of any voting classifier in terms of the margin distribution over the training data. However, important questions were raised about the margin explanation. Breiman (1999) proved a bound in terms of the minimum margin, which is sharper than the margin distribution bound. He argued that the minimum margin would be better at predicting the generalization error. Grove and Schuurmans (1998) developed an algorithm called LP-Adaboost, which maximizes the minimum margin while keeping all other factors the same as in Adaboost. In experiments, however, LP-Adaboost usually performs worse than Adaboost, putting the margin explanation into serious doubt. In this paper, we make a refined analysis of the margin theory. We prove a bound in terms of a new margin measure called the Equilibrium margin (Emargin). The Emargin bound is uniformly sharper than Breiman's minimum margin bound. Thus our result suggests that the minimum margin may not be crucial for the generalization error. We also show that a large Emargin and a small empirical error at Emargin imply a smaller bound on the generalization error. Experimental results on benchmark data sets demonstrate that Adaboost usually has a larger Emargin and a smaller test error than LP-Adaboost, which agrees well with our theory.
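The quantities this debate turns on are easy to compute. Below is a small sketch (my own illustration, not the paper's code) of the normalised margins of a voting classifier, from which both Breiman's minimum margin and the full margin distribution are read off; the weights and predictions are made-up toy values:

```python
import numpy as np

def voting_margins(alphas, preds, y):
    """Normalised margins of a voting classifier.
    preds: (T, n) array of weak-hypothesis outputs in {-1, +1}
    margin_i = y_i * sum_t alpha_t * h_t(x_i) / sum_t alpha_t, in [-1, 1]."""
    alphas = np.asarray(alphas, dtype=float)
    score = alphas @ preds / alphas.sum()
    return y * score

# toy ensemble: 3 weak hypotheses, 3 examples
alphas = [0.6, 0.3, 0.1]
preds = np.array([[ 1,  1, -1],
                  [ 1, -1, -1],
                  [-1,  1, -1]])
y = np.array([1, 1, -1])
m = voting_margins(alphas, preds, y)
min_margin = m.min()   # Breiman's minimum margin; the margin theory
                       # instead bounds error via the whole distribution of m
```

LP-Adaboost maximizes `min_margin` directly; the Emargin argument is that a single order statistic of `m` is the wrong summary of this distribution.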

  • ICIP - Online Adaboost ECOC for image classification
    2010 IEEE International Conference on Image Processing, 2010
    Co-Authors: Hongwen Huo, Jufu Feng
    Abstract:

    We present a novel online algorithm called online Adaboost ECOC (error-correcting output codes) for image classification problems. In recent years, Adaboost has been very successful in many domains, such as object detection in images and videos. It is a representative large-margin classifier for binary classification problems and is efficient for online learning. However, image classification is a typical multi-class problem, where Adaboost is difficult to apply, especially in an online setting. In this paper, we combine online Adaboost with the ECOC algorithm to solve online multi-class image classification problems. We evaluate online Adaboost ECOC on the MNIST handwritten digit, ORL face, and UCI image databases. The results show our algorithm's accuracy and robustness.
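The ECOC half of such a combination reduces a K-class problem to L binary problems, one per code-matrix column, and decodes by comparing the binary outputs to each class's code row. A minimal sketch, assuming loss-based decoding over real-valued binary scores (e.g., the Adaboost ensemble sums); the code matrix here is a made-up example, not the one from the paper:

```python
import numpy as np

def ecoc_predict(code_matrix, binary_scores):
    """Decode ECOC: return the class whose code row agrees most with the
    vector of binary classifier scores.
    code_matrix: (K, L) in {-1, +1}, one row per class, one column per
    binary learner; binary_scores: (L,) real-valued outputs."""
    return int(np.argmax(code_matrix @ binary_scores))

# toy 3-class code with 3 binary learners
code_matrix = np.array([[ 1,  1, -1],
                        [-1,  1,  1],
                        [ 1, -1,  1]])
scores = np.array([0.9, 0.8, -0.7])  # e.g., three Adaboost ensemble sums
```

In the online setting, each of the L binary learners is an online Adaboost instance that is updated in place as new labeled images arrive; only the decoding step above is shown here.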