Ernestine Wirngo Tani


Evaluation is an essential facet of education and plays a significant role in providing feedback to stakeholders; pedagogy is not complete without learner assessment. Objective tests are used extensively as a test format in primary school, yet their conception and construction remain a challenge for most teachers, which casts doubt on test quality. To mitigate these quality concerns, every test format should undergo item (task) analysis. This study sets out to evaluate the item and test quality of a national achievement test of English language using the difficulty index (DIF) and discrimination index (DI), in order to identify which tasks were appropriate for the respective levels. The study made use of data collected by the Ministry of Basic Education aimed at measuring the true scores of learners in order to plan new pedagogic tools for improving the quality of reading and mathematics among primary school pupils. The Classical Test Theory (CTT), which utilizes two main statistics, the item difficulty index and the discrimination index, was employed. Through an ex-post facto analysis, the results showed that the national achievement test was easy, thus depicting good performance by pupils, whereas in reality the reverse is true. Items that about 90% of the pupils answered correctly were consequently useless for discriminating among pupils. Tasks such as Measurement and Size for Class Three, and Addition and Subtraction and Familiar Word Identification for Class Five, should be discarded entirely, as their DIF stood at 1.00. Given that quality control is important in test development, teachers are recommended to perform item analysis and to align classroom instruction with test items so as to achieve instructional validity.
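Under CTT, the difficulty (facility) index of an item is the proportion of examinees who answer it correctly, and the upper-lower discrimination index is the difference in that proportion between high- and low-scoring groups. The following is a minimal sketch of these two statistics, not code from the study: the 27% upper/lower grouping is a common convention rather than one stated here, and the function names and toy data are illustrative assumptions.

```python
# Illustrative sketch of standard CTT item statistics (not from the article).
# Assumes a 0/1-scored response matrix: rows = pupils, columns = items.

def item_difficulty(responses, item):
    """Difficulty (facility) index: proportion of pupils answering correctly."""
    scores = [row[item] for row in responses]
    return sum(scores) / len(scores)

def discrimination_index(responses, item, group_frac=0.27):
    """Upper-lower discrimination index: p(upper group) - p(lower group).

    The 27% grouping fraction is a common convention, not from the study.
    """
    ranked = sorted(responses, key=sum, reverse=True)  # rank pupils by total score
    n = max(1, round(len(ranked) * group_frac))
    upper = ranked[:n]   # highest-scoring pupils
    lower = ranked[-n:]  # lowest-scoring pupils
    p_upper = sum(row[item] for row in upper) / n
    p_lower = sum(row[item] for row in lower) / n
    return p_upper - p_lower

if __name__ == "__main__":
    # Toy data: 10 pupils x 3 items (1 = correct, 0 = wrong).
    # Item 0 is answered correctly by everyone, mirroring the DIF = 1.00
    # tasks the study recommends discarding: DIF = 1.00, DI = 0.00.
    data = [
        [1, 1, 1], [1, 1, 1], [1, 1, 0], [1, 1, 0], [1, 0, 1],
        [1, 0, 0], [1, 0, 1], [1, 0, 0], [1, 0, 0], [1, 0, 0],
    ]
    for i in range(3):
        print(f"Item {i}: DIF={item_difficulty(data, i):.2f}, "
              f"DI={discrimination_index(data, i):.2f}")
```

An item with DIF near 1.00 yields DI near 0.00 by construction, which is why the study flags such tasks as useless for discriminating among pupils.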




Keywords: test analysis/task analysis, difficulty/facility index, discrimination indices, test statistics





DOI: http://dx.doi.org/10.46827/ejes.v8i8.3861



Copyright (c) 2021 Ernestine Wirngo Tani

This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2015-2022. European Journal of Education Studies (ISSN 2501 - 1111) is a registered trademark of Open Access Publishing Group. All rights reserved.
