Research Article

Predicting Student’s Performance using Machine Learning

by Vrushali A. Sungar, Pooja D. Shinde, Monali V. Rupnar
Communications on Applied Electronics
Foundation of Computer Science (FCS), NY, USA
Volume 7 - Number 11
Year of Publication: 2017
Authors: Vrushali A. Sungar, Pooja D. Shinde, Monali V. Rupnar
DOI: 10.5120/cae2017652730

Vrushali A. Sungar, Pooja D. Shinde, Monali V. Rupnar. Predicting Student’s Performance using Machine Learning. Communications on Applied Electronics. 7, 11 (Dec 2017), 11-15. DOI=10.5120/cae2017652730

@article{10.5120/cae2017652730,
author = {Vrushali A. Sungar and Pooja D. Shinde and Monali V. Rupnar},
title = {Predicting Student’s Performance using Machine Learning},
journal = {Communications on Applied Electronics},
issue_date = {Dec 2017},
volume = {7},
number = {11},
month = {Dec},
year = {2017},
issn = {2394-4714},
pages = {11-15},
numpages = {5},
url = {https://www.caeaccess.org/archives/volume7/number11/789-2017652730/},
doi = {10.5120/cae2017652730},
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A Vrushali A. Sungar
%A Pooja D. Shinde
%A Monali V. Rupnar
%T Predicting Student’s Performance using Machine Learning
%J Communications on Applied Electronics
%@ 2394-4714
%V 7
%N 11
%P 11-15
%D 2017
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Education plays a vital role in a student’s life. When choosing a field of study, a student has a number of options available. A student’s marks, aptitude, family background, and educational environment are the essential factors in selecting a career path, and these factors act as the training set for the learning system that performs the classification. Over time, educational records accumulate and grow rapidly. To handle this data, along with new features, without forgetting previously learnt knowledge, machine learning provides the incremental learning technique. An incremental learning algorithm retains previous knowledge in order to take future decisions and update the system. Knowledge is represented by combining different classifiers to identify a student’s features for his/her career growth. In this paper, an ensemble technique is used together with an incremental algorithm for students’ career choice, and results over real-world data sets are used to validate the effectiveness of this method.
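
The idea sketched in the abstract can be illustrated with a short Python example. This is not the authors’ implementation: the class name IncrementalEnsemble, the choice of decision trees as base learners, and the synthetic data are illustrative assumptions. It shows one common way to combine incremental learning with an ensemble: each newly arriving batch of student records trains a fresh classifier, and predictions are made by an accuracy-weighted vote over all classifiers trained so far, so earlier knowledge is kept while new batches are absorbed.

# Minimal, hypothetical sketch of batch-incremental ensemble learning.
# Each new chunk of student records trains a fresh base classifier; the
# ensemble predicts by accuracy-weighted majority vote over all of them.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

class IncrementalEnsemble:
    def __init__(self):
        self.models = []   # one base classifier per data batch seen so far
        self.weights = []  # accuracy of each classifier on its own batch

    def partial_fit(self, X, y):
        """Learn from a new batch without revisiting earlier batches."""
        model = DecisionTreeClassifier(max_depth=4)
        model.fit(X, y)
        self.models.append(model)
        self.weights.append(model.score(X, y))

    def predict(self, X):
        """Accuracy-weighted vote over all classifiers trained so far."""
        classes = np.unique(np.concatenate([m.classes_ for m in self.models]))
        votes = np.zeros((len(X), len(classes)))
        for model, w in zip(self.models, self.weights):
            for i, p in enumerate(model.predict(X)):
                votes[i, np.where(classes == p)[0][0]] += w
        return classes[votes.argmax(axis=1)]

# Illustrative usage on synthetic features standing in for marks, aptitude,
# and educational environment scores (not the paper's real-world dataset).
rng = np.random.default_rng(0)
ensemble = IncrementalEnsemble()
for _ in range(3):                           # three arriving batches of records
    X = rng.random((100, 3))
    y = (X[:, 0] + X[:, 1] > 1).astype(int)  # toy "suitable career path" label
    ensemble.partial_fit(X, y)
print(ensemble.predict(rng.random((5, 3))))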

Index Terms

Computer Science
Information Sciences

Keywords

Incremental learning, classifiers, machine learning, knowledge