Abstract
This paper presents a modification of the AdaBoost algorithm that accounts for imprecision in the calculation of the observation weights. In our approach, the computed weight values are varied within a certain range, which represents the uncertainty in the calculation of the weight of each element of the learning set. We use boosting by reweighting, where each weak classifier is based on the recursive partitioning method. Experiments were carried out on eight data sets from the UCI repository and on two randomly generated data sets. The results are compared with those of the original AdaBoost algorithm using appropriate statistical tests.
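The sketch below illustrates the general idea in Python, not the paper's exact scheme: standard AdaBoost with reweighting, a depth-1 decision tree standing in for the recursive-partitioning weak learner, and a hypothetical parameter `eps` defining a symmetric range within which each updated weight is perturbed to model the imprecision (the precise form of the range is specified in the paper, not here).

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_imprecise(X, y, T=50, eps=0.05, seed=None):
    """AdaBoost by reweighting where each updated weight is jittered
    within +/- eps of its computed value (illustrative only).
    Labels y are assumed to take values in {-1, +1}."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    w = np.full(n, 1.0 / n)              # initial observation weights
    learners, alphas = [], []
    for _ in range(T):
        # weak classifier: a shallow tree trained on the weighted sample
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.sum(w[pred != y])
        if err == 0 or err >= 0.5:       # stop if the learner is perfect or too weak
            break
        alpha = 0.5 * np.log((1 - err) / err)
        # standard reweighting step
        w = w * np.exp(-alpha * y * pred)
        # imprecision step (assumed form): perturb each weight within its range
        w = w * (1 + rng.uniform(-eps, eps, size=n))
        w = np.clip(w, 1e-12, None)
        w /= w.sum()                     # renormalise to a distribution
        learners.append(stump)
        alphas.append(alpha)

    def predict(X_new):
        votes = sum(a * h.predict(X_new) for a, h in zip(alphas, learners))
        return np.sign(votes)
    return predict
```

Setting `eps=0` recovers ordinary AdaBoost by reweighting, which makes the perturbed variant easy to compare against the baseline on the same data splits.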