Concurrent Validity of the Senior School Certificate Examinations Chemistry Items in Nigeria

Temitope Babatimehin
Aghogo Okunola
Olawale Ayoola Ogunsanmi

Abstract

The study compared the scores of examinees on West African Examinations Council (WAEC) and National Examinations Council (NECO) Senior School Certificate Examination (SSCE) chemistry items under classical test theory (CTT) and item response theory (IRT). A non-experimental design of the descriptive research type was adopted. The population comprised the 36,182 students who registered for the 2017/2018 WAEC SSCE Chemistry examination in Osun State. Two instruments were used to collect data: Chemistry Achievement Test Types I and II, which were adopted versions of the May/June 2015 WAEC and June/July 2015 NECO Paper I respectively. A sample of 1,105 students was randomly selected. Simple random sampling was used to select five of the ten Local Government Areas (LGAs) in each of the three senatorial districts of Osun State. A purposive sampling procedure was then used to select two co-educational schools (one public and one private) from each of these LGAs (30 schools in all) as well as two co-educational Federal unity schools. Data were analysed using CTT and IRT methods of scoring, means, standard deviations, a correlation matrix and scatter plots. Results indicated that examinees' estimated scores on the WAEC Chemistry Items (WCI) were higher under IRT (Mean = 22.23, SD = 9.88) than under CTT (Mean = 19.65, SD = 8.15); likewise, their estimated scores on the NECO Chemistry Items (NCI) were higher under IRT (Mean = 26.95, SD = 11.69) than under CTT (Mean = 24.55, SD = 9.32). A moderate correlation also existed between examinees' scores on the WCI and NCI under both IRT (r = 0.61) and CTT (r = 0.63). The study concluded that there was fair concurrent validity between examinees' scores on the two examinations under IRT and CTT, and recommended that the IRT method be utilised for scoring items and for establishing concurrent validity in item validation.
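
To make the scoring comparison concrete, the sketch below shows, in Python, one way an analysis of this kind could be carried out: number-correct (CTT) scores and two-parameter logistic (2PL) maximum-likelihood ability (IRT) estimates are computed for two test forms and then correlated. This is only an illustrative sketch, not the study's analysis code: the response matrices are simulated, and the item counts, item parameters (a_wci, b_wci, a_nci, b_nci) and the 2PL model itself are assumptions made for the example.

import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import pearsonr

rng = np.random.default_rng(0)

# Assumed sizes: 1,105 examinees (as in the study's sample) and 50 items per form.
n_examinees, n_items = 1105, 50
theta_true = rng.normal(0, 1, n_examinees)   # simulated latent abilities

def simulate_responses(theta, a, b):
    # Dichotomous (0/1) responses under a 2PL model.
    p = 1 / (1 + np.exp(-a * (theta[:, None] - b)))
    return (rng.random(p.shape) < p).astype(int)

# Hypothetical item parameters for the WAEC-type (WCI) and NECO-type (NCI) forms.
a_wci, b_wci = rng.uniform(0.5, 2.0, n_items), rng.normal(0, 1, n_items)
a_nci, b_nci = rng.uniform(0.5, 2.0, n_items), rng.normal(0, 1, n_items)
resp_wci = simulate_responses(theta_true, a_wci, b_wci)
resp_nci = simulate_responses(theta_true, a_nci, b_nci)

# CTT scoring: raw number-correct totals.
ctt_wci = resp_wci.sum(axis=1)
ctt_nci = resp_nci.sum(axis=1)

def irt_scores(responses, a, b):
    # Maximum-likelihood ability estimate for each examinee under the 2PL.
    def neg_loglik(theta, x):
        p = 1 / (1 + np.exp(-a * (theta - b)))
        p = np.clip(p, 1e-9, 1 - 1e-9)
        return -np.sum(x * np.log(p) + (1 - x) * np.log(1 - p))
    return np.array([
        minimize_scalar(neg_loglik, bounds=(-4, 4), args=(x,), method="bounded").x
        for x in responses
    ])

irt_wci = irt_scores(resp_wci, a_wci, b_wci)
irt_nci = irt_scores(resp_nci, a_nci, b_nci)

# Concurrent validity: correlate scores on the two forms under each framework.
print("CTT r(WCI, NCI):", round(pearsonr(ctt_wci, ctt_nci)[0], 2))
print("IRT r(WCI, NCI):", round(pearsonr(irt_wci, irt_nci)[0], 2))

In an actual validation study the examinees' real responses would be used, and the item parameters would first be calibrated from the data (for example by marginal maximum likelihood) rather than assumed as they are here.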
