Publisher: EXIT Publishing House
Languages: English
Types: Part of book or chapter of book
Subjects: aintel, csi
This chapter covers different approaches that may be taken when building an ensemble method, through studying specific examples of each approach from research conducted by the authors. A method called Negative Correlation Learning illustrates a decision level combination approach with individual classifiers trained co-operatively. The model level combination paradigm is illustrated via a tree combination method. Finally, another variant of the decision level paradigm, with individuals trained independently instead of co-operatively, is discussed as applied to churn prediction in the telecommunications industry.
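
The abstract names Negative Correlation Learning (NCL) without spelling out the training rule. The sketch below is a minimal illustration of the commonly cited NCL formulation for regression, in which each ensemble member i minimises 0.5*(f_i - y)^2 + lambda*(f_i - F)*sum_{j!=i}(f_j - F), with F the ensemble mean; the data, model form, and hyper-parameter values are illustrative assumptions and are not taken from the chapter.

    import numpy as np

    # Minimal sketch of Negative Correlation Learning (NCL) for regression.
    # Members are simple linear models trained jointly by gradient descent;
    # sizes and hyper-parameters below are illustrative assumptions.

    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(200, 3))
    y = X @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.standard_normal(200)

    M, lam, lr, epochs = 5, 0.5, 0.01, 200
    W = rng.standard_normal((M, X.shape[1])) * 0.1   # one weight vector per member
    b = np.zeros(M)

    for _ in range(epochs):
        f = X @ W.T + b                    # (N, M) member outputs
        F = f.mean(axis=1, keepdims=True)  # (N, 1) ensemble mean
        # dE_i/df_i = (f_i - y) - lam * (f_i - F); the penalty term pushes each
        # member away from the ensemble mean, decorrelating member errors.
        grad_f = (f - y[:, None]) - lam * (f - F)
        W -= lr * (grad_f.T @ X) / len(X)
        b -= lr * grad_f.mean(axis=0)

    pred = (X @ W.T + b).mean(axis=1)
    print("ensemble MSE:", np.mean((pred - y) ** 2))

With lambda set to 0 each member trains independently, which corresponds to the independently trained decision-level variant the abstract mentions; increasing lambda trades individual accuracy for error diversity across the ensemble.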
