Liang, G; Cohn, AG (2013)
Publisher: AAAI Press
Languages: English
Types: Other
Subjects: Classified by OpenAIRE into ACM Ref: ComputingMethodologies_PATTERNRECOGNITION

Learning from imbalanced data is an important problem in data mining research. Much research has addressed the problem of imbalanced data by using sampling methods to generate an equally balanced training set to improve the performance of prediction models, but it is unclear what ratio of class distribution is best for training a prediction model. Bagging is one of the most popular and effective ensemble learning methods for improving the performance of prediction models; however, it has a major drawback on extremely imbalanced data sets, and it is unclear under which conditions bagging is outperformed by other sampling schemes for imbalanced classification. These issues motivate us to propose a novel approach, unevenly balanced bagging (UBagging), to boost the performance of the prediction model for imbalanced binary classification. Our experimental results demonstrate that UBagging is effective and statistically significantly superior to single-learner J48 decision trees (SingleJ48), bagging, and equally balanced bagging (BBagging) on 32 imbalanced data sets.
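The abstract describes UBagging only at a high level, so the sketch below is an illustration rather than the authors' implementation. It assumes each bag keeps a minority-class bootstrap at the original minority size and bootstraps the majority class at a tunable majority-to-minority ratio, trains one decision tree per bag (scikit-learn's DecisionTreeClassifier standing in for J48), and aggregates by majority vote; names such as `fit_ratio_bagging` and `maj_to_min_ratio` are hypothetical.

```python
# Minimal sketch of ratio-controlled bagging for imbalanced binary data.
# Assumptions (not from the paper): every bag bootstraps the minority class
# at its original size and the majority class at `maj_to_min_ratio` times
# that size; DecisionTreeClassifier stands in for J48; majority vote.
from collections import Counter

import numpy as np
from sklearn.tree import DecisionTreeClassifier


def fit_ratio_bagging(X, y, n_estimators=50, maj_to_min_ratio=1.0, seed=0):
    """Train one tree per class-ratio-controlled bootstrap sample.

    maj_to_min_ratio=1.0 reproduces equally balanced bags (BBagging-style);
    other values yield unevenly balanced bags.
    """
    rng = np.random.default_rng(seed)
    counts = Counter(y)
    minority = min(counts, key=counts.get)   # rarer class label
    majority = max(counts, key=counts.get)   # more frequent class label
    min_idx = np.flatnonzero(y == minority)
    maj_idx = np.flatnonzero(y == majority)
    n_maj = max(1, round(maj_to_min_ratio * len(min_idx)))

    trees = []
    for _ in range(n_estimators):
        bag = np.concatenate([
            rng.choice(min_idx, size=len(min_idx), replace=True),
            rng.choice(maj_idx, size=n_maj, replace=True),
        ])
        trees.append(DecisionTreeClassifier().fit(X[bag], y[bag]))
    return trees


def predict_majority_vote(trees, X):
    """Aggregate the ensemble's predictions by unweighted majority vote."""
    votes = np.stack([t.predict(X) for t in trees])  # (n_trees, n_samples)
    return np.array([Counter(col).most_common(1)[0][0] for col in votes.T])
```

In this sketch, sweeping `maj_to_min_ratio` over values other than 1.0 is what distinguishes unevenly balanced bags from the standard 50:50 equally balanced choice that the abstract argues should not be treated as a default.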
References (discovered through OpenAIRE's pilot algorithms):

    • Breiman, L. 1996. Bagging predictors. Machine Learning 24(2):123-140.
    • Demšar, J. 2006. Statistical comparisons of classifiers over multiple data sets. Journal of Machine Learning Research 7:1-30.
    • Goebel, M. 2004. Ensemble learning by data resampling. Ph.D. dissertation.
    • Hido, S., Kashima, H., and Takahashi, Y. 2009. Roughly balanced bagging for imbalanced data. Statistical Analysis and Data Mining 2(5-6):412-426.
    • Li, C. 2007. Classifying imbalanced data using a bagging ensemble variation (BEV). In Proceedings of the 45th ACM Annual Southeast Regional Conference, 203-208.
    • Liang, G., Zhu, X., and Zhang, C. 2011. An empirical study of bagging predictors for imbalanced data with different levels of class distribution. In Proceedings of the 24th Australasian Conference on Artificial Intelligence, 213-222.
    • Liang, G., Zhu, X., and Zhang, C. 2012. The effect of varying levels of class distribution on bagging for different algorithms: An empirical study. International Journal of Machine Learning and Cybernetics. http://link.springer.com/article/10.1007%2Fs13042-012-0125-5.
    • Merz, C., and Murphy, P. 2006. UCI repository of machine learning databases. http://archive.ics.uci.edu/ml/.
    • Weiss, G., and Provost, F. 2003. Learning when training data are costly: The effect of class distribution on tree induction. Journal of Artificial Intelligence Research 19(1):315-354.
