ITEM RESPONSE THEORY ANALYSIS ON STUDENT STATISTICAL LITERACY TESTS
Abstract
This study aims to evaluate the quality of a statistical literacy instrument. Using an ex post facto approach, the statistical literacy test results of 150 students were analyzed. The data were analyzed with PARSCALE under Item Response Theory, applying the two-parameter logistic Graded Response Model (2-PL GRM), which estimates item discrimination and item difficulty. The results showed that all 12 items analyzed were of good quality and that the 2-PL GRM was an appropriate model for statistical literacy instruments with polytomous response types. The items of the statistical literacy instrument provide accurate information for statistical literacy abilities ranging from -2.5 to +1.2 on the ability scale. The standard error of measurement (SEM) is smallest at the ability level where the test information function reaches its maximum, so this approach can be recommended for item analysis of other tests. The statistical literacy test items examined in this study can be used to assess students' statistical literacy in various schools and regions.
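PARSCALE estimates Samejima's graded response model; to make the quantities mentioned above concrete (category probabilities obtained from item discrimination and difficulty, the test information function, and the SEM), the following Python sketch computes them for two hypothetical items. The item parameters, the number of items, and the numpy-based helper functions are illustrative assumptions, not the estimates obtained in this study.

import numpy as np

def grm_category_probs(theta, a, b, D=1.7):
    # Samejima's graded response model: boundary probabilities P*_k(theta)
    # with P*_0 = 1 and P*_m = 0; a category probability is the difference
    # between adjacent boundary probabilities.
    b = np.asarray(b, dtype=float)
    p_star = 1.0 / (1.0 + np.exp(-D * a * (theta - b)))
    p_star = np.concatenate(([1.0], p_star, [0.0]))
    return p_star[:-1] - p_star[1:]

def item_information(theta, a, b, D=1.7, eps=1e-4):
    # Fisher information of a polytomous item: sum_k (P_k')^2 / P_k,
    # with the derivative of each category probability taken numerically.
    p = grm_category_probs(theta, a, b, D)
    dp = (grm_category_probs(theta + eps, a, b, D)
          - grm_category_probs(theta - eps, a, b, D)) / (2.0 * eps)
    return float(np.sum(dp ** 2 / p))

# Hypothetical item parameters (a = discrimination, b = ordered category thresholds);
# illustrative values only, not the estimates reported in this study.
items = [(1.2, [-1.5, -0.3, 0.8]),
         (0.9, [-2.0,  0.1, 1.1])]

thetas = np.linspace(-3.0, 3.0, 61)
tif = np.array([sum(item_information(t, a, b) for a, b in items) for t in thetas])
sem = 1.0 / np.sqrt(tif)  # SEM(theta) = 1 / sqrt(TIF(theta)): smallest where TIF peaks

print("TIF peaks at theta =", round(float(thetas[np.argmax(tif)]), 2),
      "with minimum SEM =", round(float(sem.min()), 3))

Because SEM(theta) = 1 / sqrt(I(theta)), measurement is most precise exactly where the test information function reaches its maximum, which is the relation between test information and SEM referred to in the abstract.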
DOI: 10.33650/pjp.v9i2.3800
Copyright (c) 2022 Mila Yulia Herosian, Yeni Rafita Sihombing, Delyanti Azzumarito Pulungan
This work is licensed under a Creative Commons Attribution-ShareAlike (CC BY-SA) license.
Published by Islamic Faculty of Nurul Jadid University, Probolinggo, East Java, Indonesia.