Author(s)

(1)  Mila Yulia Herosian   (Universitas Prima Indonesia, North Sumatera)  
(2) * Yeni Rafita Sihombing   (Universitas Prima Indonesia, North Sumatera)  
(3)  Delyanti Azzumarito Pulungan   (Universitas Prima Indonesia, North Sumatera)  
(*) Corresponding Author


This study aims to evaluate the quality of a statistical literacy instrument. Using an ex post facto approach, the statistical literacy test results of 150 students were analyzed with PARSCALE under Item Response Theory, applying the two-parameter logistic (2-PL) Graded Response Model (GRM) with item discrimination and item difficulty parameters. The results showed that all 12 items analyzed were of good quality, and that the 2-PL GRM is an appropriate model for statistical literacy instruments with polytomous response formats. The items in the instrument provide accurate information about statistical literacy ability in the range from -2.5 to +1.2 on the latent scale. The standard error of measurement (SEM) is smallest where the test information function reaches its maximum; the procedure can therefore be recommended for item analysis of other tests. The statistical literacy test items examined in this study can be used to assess students' statistical literacy across different schools and regions.
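The study estimated its parameters with PARSCALE, so the sketch below is only an illustration of the model itself: under Samejima's Graded Response Model in 2-PL form, each category boundary of a polytomous item gets a logistic curve with a shared discrimination a and a boundary difficulty b_k, and category probabilities are differences of adjacent cumulative curves. The parameter values in the docstring example are made up for illustration, not estimates from this study.

```python
import math

def grm_category_probs(theta, a, b):
    """Category response probabilities under the 2-PL Graded Response Model.

    theta -- examinee ability on the latent scale
    a     -- item discrimination parameter
    b     -- ascending list of category boundary difficulties

    Example (illustrative values): grm_category_probs(0.0, 1.2, [-1.0, 0.5, 1.5])
    """
    # Cumulative probability of responding in category k or above,
    # modeled with a 2-PL logistic curve for each boundary difficulty.
    cum = [1.0] + [1.0 / (1.0 + math.exp(-a * (theta - bk))) for bk in b] + [0.0]
    # Probability of landing exactly in category k is the difference
    # between adjacent cumulative curves; the probabilities sum to 1.
    return [cum[k] - cum[k + 1] for k in range(len(cum) - 1)]
```

With three boundaries the function returns four category probabilities; higher discrimination makes the boundary curves steeper, which is what concentrates test information (and minimizes SEM) around the boundary difficulties.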


Item Response Theory; Two-Parameter Logistic Model; Statistical Literacy Test.








Copyright (c) 2022 Mila Yulia Herosian, Yeni Rafita Sihombing, Delyanti Azzumarito Pulungan

This work is licensed under a Creative Commons Attribution-ShareAlike (CC BY-SA) license.

Published by Islamic Faculty of Nurul Jadid University, Probolinggo, East Java, Indonesia.