Abstract
Data mining is the process of analyzing data from different perspectives and summarizing it into useful information. Classification is the data mining task of predicting the category of an instance by building a model from predictor variables; its goal is to organize data into distinct classes. An artificial neural network (ANN) requires no prior knowledge of the statistical distribution of the classes in the data source: it can be trained to learn the classification criteria and to generalize them, so that new inputs not seen during training are classified correctly. Back propagation, as a training algorithm for ANNs, works well for classification. This paper addresses the problem of improving the fitness of the back propagation network (BPN) algorithm and presents a performance analysis of several classification techniques, namely Naïve Bayes, Bayesian network, Support Vector Machine, and a genetic-algorithm-optimized BPN (GABPN).
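As a rough illustration of the kind of comparison the abstract describes, the sketch below evaluates a backpropagation-trained neural network alongside Naïve Bayes and an SVM using cross-validation. It is a minimal sketch under stated assumptions (scikit-learn classifiers, the Iris dataset as a stand-in); the paper's actual data, parameter settings, Bayesian-network model, and the GA-based tuning of the BPN (GABPN) are not reproduced here.

```python
# Minimal sketch, NOT the authors' experimental setup: compare a
# backpropagation-trained network with Naive Bayes and an SVM.
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

# Stand-in dataset; the paper's data source is not specified here.
X, y = load_iris(return_X_y=True)

classifiers = {
    "Naive Bayes": GaussianNB(),
    "SVM (RBF kernel)": SVC(kernel="rbf", C=1.0),
    # Feed-forward network trained by back propagation; stands in for the
    # BPN. The GA-optimized variant (GABPN) is not implemented here.
    "BPN (MLP, backprop)": MLPClassifier(hidden_layer_sizes=(10,),
                                         max_iter=2000, random_state=0),
}

for name, clf in classifiers.items():
    # 5-fold cross-validation accuracy for each technique
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```

A genetic algorithm would then be layered on top of the BPN, for example by evolving its weights or hyperparameters against classification accuracy as the fitness function, which is the kind of improvement the paper investigates.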
Copyright information
© 2012 Springer-Verlag Berlin Heidelberg
Cite this paper
Panchal, G., Ganatra, A., Kosta, Y.P., Panchal, D. (2012). Performance Analysis of Classification Techniques Using Different Parameters. In: Kannan, R., Andres, F. (eds) Data Engineering and Management. ICDEM 2010. Lecture Notes in Computer Science, vol 6411. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-27872-3_34
DOI: https://doi.org/10.1007/978-3-642-27872-3_34
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-27871-6
Online ISBN: 978-3-642-27872-3