An Experimental Study of K* Algorithm

Author: Dayana C. Tejera Hernández

Journal: International Journal of Information Engineering and Electronic Business (IJIEEB)

Issue: Vol. 7, No. 2, 2015.

Free access

Machine Learning techniques are finding their way into all areas of our lives, helping us to make decisions. A large number of algorithms are available, serving multiple purposes and suited to specific data types, so special attention is required when deciding which technique to recommend in each case. K Star is an instance-based learner designed to handle missing values, smoothness problems, and both real- and symbolic-valued attributes; however, little is known about how it behaves in the presence of attribute and class noise, or with mixed attribute values in the datasets. In this paper we carried out six experiments with Weka to compare K Star against other important algorithms: Naïve Bayes, C4.5, Support Vector Machines and k-Nearest Neighbors, taking into account their performance when classifying datasets with those features. As a result, K Star proved to be the best of them at dealing with noisy attributes and with imbalanced attributes.
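For context on the setup described above: K Star ships with Weka as weka.classifiers.lazy.KStar, and its entropic distance is defined in the cited Cleary and Trigg paper as K*(b|a) = -log2 P*(b|a), the complexity in bits of transforming instance a into instance b. The following is a minimal sketch of this kind of classifier comparison using Weka's Java API and stratified 10-fold cross-validation; the dataset path is a placeholder, and the paper's six experiments (noise injection, class imbalance, mixed attribute types) are not reproduced here, only the mechanical comparison step.

    import java.util.Random;
    import weka.classifiers.Classifier;
    import weka.classifiers.Evaluation;
    import weka.classifiers.bayes.NaiveBayes;
    import weka.classifiers.functions.SMO;
    import weka.classifiers.lazy.IBk;
    import weka.classifiers.lazy.KStar;
    import weka.classifiers.trees.J48;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class KStarComparison {
        public static void main(String[] args) throws Exception {
            // Placeholder path: substitute any ARFF dataset,
            // e.g. one taken from the KEEL or UCI repositories.
            Instances data = DataSource.read("dataset.arff");
            data.setClassIndex(data.numAttributes() - 1); // class assumed last

            // The five classifiers compared in the paper, under their Weka names:
            // KStar (K*), IBk (k-Nearest Neighbors), NaiveBayes,
            // J48 (C4.5), and SMO (a Support Vector Machine trainer).
            Classifier[] classifiers = {
                new KStar(), new IBk(), new NaiveBayes(), new J48(), new SMO()
            };

            for (Classifier cls : classifiers) {
                Evaluation eval = new Evaluation(data);
                // Stratified 10-fold cross-validation, fixed seed for repeatability.
                eval.crossValidateModel(cls, data, 10, new Random(1));
                System.out.printf("%-12s accuracy: %.2f%%%n",
                        cls.getClass().getSimpleName(), eval.pctCorrect());
            }
        }
    }

Each classifier is used here with Weka's default parameters; a study like the one described would also tune settings such as k for IBk or the kernel for SMO before drawing comparisons.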


Keywords: Machine Learning techniques, K Star, k-Nearest Neighbors, Naïve Bayes, C4.5, Support Vector Machines, machine learning algorithms comparison

Short URL: https://sciup.org/15013335

IDR: 15013335

References: An Experimental Study of K* Algorithm

  • Alpaydın, E., Introduction to Machine Learning, T. Dietterich, Editor. 2010: London, England.
  • Witten, I.H., E. Frank, and M.A. Hall, Data Mining: Practical Machine Learning Tools and Techniques, Elsevier, 2011.
  • Martin, B., Instance-Based Learning: Nearest Neighbour with Generalisation, Department of Computer Science. 1995, University of Waikato: Hamilton, New Zealand. p. 83.
  • Cleary, J. and L. Trigg, K*: An Instance-based Learner Using an Entropic Distance Measure, in 12th International Conference on Machine Learning. 1995. p. 108-114.
  • Uzun, Y. and G. Tezel, Rule Learning with Machine Learning Algorithms and Artificial Neural Networks. Journal of Selçuk University Natural and Applied Science, 2012. 1(2).
  • Er, E., Identifying At-Risk Students Using Machine Learning Techniques: A Case Study with IS 100. International Journal of Machine Learning and Computing, 2012. 2(4): p. 279.
  • Wu, X., et al., Top 10 algorithms in data mining. Knowledge and Information Systems (Springer), 2008. 14: p. 1-37.
  • Kalapanidas, E., et al., Machine Learning Algorithms: A Study on Noise Sensitivity. First Balkan Conference in Informatics, November 2003; p. 356-365.
  • Vijayarani, S. and M. Muthulakshmi, Comparative Analysis of Bayes and Lazy Classification Algorithms. International Journal of Advanced Research in Computer and Communication Engineering, August 2013; 2(8): p. 3118-3124.
  • Douglas, P.K., et al., Performance comparison of machine learning algorithms and number of independent components used in fMRI decoding of belief vs. disbelief. NeuroImage, 15 May 2011; 56(2): p. 544-553.
  • Hall, M.A., et al., The WEKA Data Mining Software: An Update. SIGKDD Explorations, 2009. 11(1): p. 10-18.
  • Alcalá-Fdez, J., et al., KEEL Data-Mining Software Tool: Data Set Repository, Integration of Algorithms and Experimental Analysis Framework. Journal of Multiple-Valued Logic & Soft Computing, 2011. 17: p. 255-287.
  • Demšar, J., Statistical Comparisons of Classifiers over Multiple Data Sets. Journal of Machine Learning Research, 2006. 7: p. 1-30.
Research article