Assessment of Ability Estimate and Model Fit of Students in 2020 NECO Mathematics in Benue State, Nigeria

Onuh Omale
Gloria Adaku Dike
C. A. Chibundum

Abstract

The basic aim of test development is to construct a test of the desired quality by choosing appropriate items through item analysis and ensuring their reliability and validity. In developing quality test items that effectively measure students' achievement, it is pertinent that best practices in test construction be employed by NECO. The study was guided by two research questions and adopted a non-experimental design. It was carried out in Benue State, Nigeria. The population for the study comprised the 18,252 Senior Secondary School three (SSS3) students who registered for and sat the 2020 NECO Mathematics examination. The sample consisted of 1,825 students, arrived at by taking 10% of the population. The data collected were analyzed using the ability estimates and model-fit statistics built into jMetrik to answer the research questions, at the 0.05 level of significance. The findings revealed that students' ability estimates ranged from 0.03 upwards and that the data fitted the 3-parameter logistic model (3-PLM). Based on these results, it was concluded that the test items lie within acceptable ranges of ability estimation, fit the 3-PLM, and can be used to compare students' latent abilities for sound educational decisions in our schools. It was recommended that examination bodies and researchers adopt IRT in solving measurement problems.
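The abstract reports that ability estimates and model fit were obtained by fitting the 3-parameter logistic model (3-PLM) in jMetrik. As a rough illustration of the computation behind such ability estimates, and not of the study's actual jMetrik workflow, the Python sketch below codes the 3-PLM item response function and an expected a posteriori (EAP) ability estimate for a single examinee; the item parameters and the response pattern are invented for illustration only.

    import numpy as np

    def irf_3pl(theta, a, b, c):
        """3-PLM item response function:
        P(correct | theta) = c + (1 - c) / (1 + exp(-a * (theta - b)))."""
        return c + (1.0 - c) / (1.0 + np.exp(-a * (theta - b)))

    def eap_ability(responses, a, b, c, n_points=61):
        """EAP ability estimate for one examinee, using a standard-normal
        prior over theta and simple numerical quadrature."""
        theta = np.linspace(-4.0, 4.0, n_points)        # quadrature points
        prior = np.exp(-0.5 * theta ** 2)               # N(0, 1) prior (unnormalised)
        p = irf_3pl(theta[:, None], a, b, c)            # shape: (n_points, n_items)
        likelihood = np.prod(np.where(responses, p, 1.0 - p), axis=1)
        posterior = prior * likelihood
        return np.sum(theta * posterior) / np.sum(posterior)

    # Hypothetical item parameters: a = discrimination, b = difficulty, c = guessing
    a = np.array([1.2, 0.8, 1.5, 1.0])
    b = np.array([-0.5, 0.0, 0.7, 1.2])
    c = np.array([0.20, 0.25, 0.20, 0.15])

    # One examinee's scored responses (1 = correct, 0 = incorrect)
    responses = np.array([1, 1, 0, 1], dtype=bool)

    print(f"EAP ability estimate: {eap_ability(responses, a, b, c):.3f}")

A full calibration, such as the one performed in jMetrik, also estimates the item parameters from the examinees' response data and reports item-fit statistics; the sketch assumes the item parameters are already known.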
