Boosted Tree Ensembles for Solving Multiclass Problems

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 2364)

Included in the conference series: Multiple Classifier Systems (MCS 2002)


Abstract

In this paper we consider the combination of two ensemble techniques, both capable of producing diverse binary base classifiers: AdaBoost, a version of Boosting, is combined with Output Coding for solving multiclass problems. Decision trees are chosen as the base classifiers, and the issue of tree pruning is addressed. Pruning produces less complex trees and sometimes leads to better generalisation. Experimental results demonstrate that pruning makes little difference in this framework; however, on average over nine benchmark datasets, better accuracy is achieved by incorporating unpruned trees.
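The setup described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the paper uses C4.5-style trees and nine UCI benchmark datasets, whereas the sketch below substitutes scikit-learn's CART trees, AdaBoostClassifier, and OutputCodeClassifier (a randomised output-coding wrapper), with cost-complexity pruning (ccp_alpha) standing in for C4.5's pruning and load_digits standing in for the benchmark data.

```python
# Minimal sketch, assuming scikit-learn stand-ins for the paper's components:
# OutputCodeClassifier decomposes the multiclass problem into binary tasks,
# and each binary task is solved by AdaBoost over decision trees.  CART trees
# with ccp_alpha pruning replace the paper's C4.5 trees and C4.5 pruning.
from sklearn.datasets import load_digits
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OutputCodeClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

def boosted_output_coding(tree_params):
    """Each output-code column becomes a binary problem for boosted trees."""
    boosted_trees = AdaBoostClassifier(
        DecisionTreeClassifier(**tree_params),
        n_estimators=25,
        random_state=0,
    )
    return OutputCodeClassifier(boosted_trees, code_size=2.0, random_state=0)

# Compare fully grown (unpruned) trees with pruned trees, as in the paper.
for label, params in [("unpruned", {}), ("pruned", {"ccp_alpha": 0.01})]:
    model = boosted_output_coding(params).fit(X_train, y_train)
    print(f"{label} trees, test accuracy: {model.score(X_test, y_test):.3f}")
```

The comparison loop mirrors the paper's question (does pruning the base trees help?); the particular dataset, pruning strength, and ensemble sizes here are illustrative assumptions, not the paper's experimental settings.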





Copyright information

© 2002 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Windeatt, T., Ardeshir, G. (2002). Boosted Tree Ensembles for Solving Multiclass Problems. In: Roli, F., Kittler, J. (eds) Multiple Classifier Systems. MCS 2002. Lecture Notes in Computer Science, vol 2364. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45428-4_4

  • DOI: https://doi.org/10.1007/3-540-45428-4_4

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-43818-2

  • Online ISBN: 978-3-540-45428-1
