Abstract
In this paper we consider the combination of two ensemble techniques, both capable of producing diverse binary base classifiers. AdaBoost, a version of Boosting, is combined with Output Coding for solving multiclass problems. Decision trees are chosen as the base classifiers, and the issue of tree pruning is addressed. Pruning produces less complex trees and sometimes leads to better generalisation. Experimental results demonstrate that pruning makes little difference in this framework; however, averaged over nine benchmark datasets, better accuracy is achieved by incorporating unpruned trees.
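To make the combination concrete, below is a minimal sketch, not the authors' exact setup: it uses scikit-learn's OutputCodeClassifier around an AdaBoost ensemble, with CART trees standing in for C4.5 and cost-complexity pruning (ccp_alpha) standing in for the pruning methods studied in the paper. The dataset, code size and all parameter values are illustrative assumptions.

```python
# Sketch: Output Coding over boosted binary decision trees,
# comparing an "unpruned" and a "pruned" base tree.
# Assumptions: scikit-learn, CART trees (not C4.5), cost-complexity
# pruning via ccp_alpha, and an illustrative benchmark dataset.
from sklearn.datasets import load_digits
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score
from sklearn.multiclass import OutputCodeClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_digits(return_X_y=True)

for label, ccp_alpha in [("unpruned", 0.0), ("pruned", 0.01)]:
    # Each output-code bit is learned by a boosted ensemble of binary trees.
    booster = AdaBoostClassifier(
        DecisionTreeClassifier(ccp_alpha=ccp_alpha),
        n_estimators=50,
    )
    # code_size=2.0 gives a redundant (error-correcting) code for 10 classes.
    ecoc = OutputCodeClassifier(booster, code_size=2.0, random_state=0)
    scores = cross_val_score(ecoc, X, y, cv=3)
    print(f"{label}: mean accuracy {scores.mean():.3f}")
```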