Cognitive Diagnostic Assessment: Critical Review of Iranian Studies

Document Type : Research Paper

Authors

1 Department of Educational Psychology, Faculty of Education and Psychology, Alzahra University, Tehran, Iran

2 Ph.D. in Assessment and Measurement, Department of Educational Science and Psychology, Allameh Tabatabaei University, Tehran, Iran

Abstract

Over the past decade, cognitive diagnostic assessment has been favored by educational researchers as an approach to assessing student achievement, although the use of diagnostic classification models (DCMs) has also been criticized. Given the growing interest of Iranian researchers in this type of assessment, the present study critically reviewed Iranian research in this field in order to analyze its methodology. To this end, the main relevant Iranian and international databases, including SID, Google Scholar, Scopus, Noormags, Magiran, and Elmnet, were searched and 32 studies were identified. After the inclusion and exclusion criteria were applied, 20 studies conducted between 2015 and 2020 were selected. The selected studies were then reviewed critically and independently by the researchers using the indicators recommended by Sessoms and Henson (2018): the thematic focus of the research, the number of competencies, the constructs measured, the structure of the test items, the Q-matrix, the type of statistical model, model fit, item fit, item type, sample size, evidence of reliability and validity, the description of the competencies (latent classes, the correlation between competencies, and the competency profile), and the application of the results. The results showed that a large proportion of the reviewed studies did not provide empirical support for general assumptions such as the correlation between attributes, did not present sufficient evidence of reliability and validity, did not report competency profiles, and did not provide feedback to stakeholders. We offer guidelines for improving the quality of applications of these models.
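
To make the review indicators concrete for readers who have not fit a DCM, the following minimal sketch (NumPy only, with illustrative toy values; not the data or code of any reviewed study) shows how a Q-matrix links items to attributes, how DINA-type ideal response patterns follow from it, and how examinees can be assigned attribute profiles by proximity to those patterns, in the spirit of the nonparametric classification of Chiu and Douglas (2013).

import itertools
import numpy as np

# Q-matrix: 4 items (rows) by 3 attributes (columns); 1 = the item requires the attribute
Q = np.array([
    [1, 0, 0],
    [0, 1, 0],
    [1, 1, 0],
    [0, 1, 1],
])
n_items, n_attrs = Q.shape

# All 2^K latent attribute profiles (K = number of attributes)
profiles = np.array(list(itertools.product([0, 1], repeat=n_attrs)))

# DINA ideal response: 1 only if a profile masters every attribute the item requires
ideal = np.array([
    [int(np.all(alpha[Q[j] == 1] == 1)) for j in range(n_items)]
    for alpha in profiles
])

# Toy observed responses for three examinees (rows) on the four items (columns)
X = np.array([
    [1, 0, 0, 0],
    [1, 1, 1, 0],
    [1, 1, 1, 1],
])

# Assign each examinee the profile whose ideal pattern is closest (Hamming distance)
dist = np.abs(X[:, None, :] - ideal[None, :, :]).sum(axis=2)
assigned = profiles[dist.argmin(axis=1)]

for i, alpha in enumerate(assigned):
    print(f"Examinee {i + 1}: estimated attribute profile = {alpha}")

In applied work, such classifications are obtained from parametric DCMs (e.g., DINA, G-DINA, RUM) together with model fit, item fit, and classification reliability evidence, which is precisely what the review indicators above examine.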

Keywords


References

Afzali, A., Delavar, A., Falsafinejad, M. R., Farrokhi, N. A., & Borjali, A. (1393). The application of cognitive diagnostic models in determining the nature of differences in the mathematics performance of male and female first-grade high school students. Psychological Achievements. 21(2), 89-104. [In Persian]
Taghiyan, H., Khodaei, E., Bazargan, A., Moghadamzadeh, A., & Kabiri, M. (1397). Developing a reading test for sixth-grade primary school students using the cognitive diagnostic assessment framework. Journal of Teaching Persian to Speakers of Other Languages. 7(1), 3-30. [In Persian]
Javidanmehr, Z., & Anani Sarab, M. R. (1396). Investigating the breadth and difficulty of reading comprehension sub-skills using the G-DINA cognitive diagnostic model. Critical Language and Literary Studies. 14(2), 99-117. [In Persian]
Rahimi, R., Younesi, J., & Mokarami, M. (1397). Application of cognitive diagnostic assessment in analyzing the reading comprehension items of the English M.A. entrance examination. Educational Measurement. 8(2), 17-40. [In Persian]
Ranjbaran, F., & Alavi, S. M. (1395). Cognitive diagnostic assessment of a reading comprehension test for formative diagnostic feedback. Journal of Foreign Language Research. 6(2), 321-342. [In Persian]
Shahmirzadi, N., Siyyari, M., Marashi, H., & Geramipour, M. (1399). Investigating bias in the reading comprehension items of the English Ph.D. entrance examination under cognitive diagnostic assessment. Journal of Foreign Language Research. 10(1), 152-165. [In Persian]
Mohsenpour, M., Gooya, Z., Shokoohi-Yekta, M., Kiamanesh, A. R., & Bazargan, A. (1394). Diagnostic assessment of mathematical literacy competencies. Educational Innovations. (1), 7-33. [In Persian]
Mohsenpour, M. (1398). Assessing the multilevel cognitive competencies of ninth-grade students' mathematical literacy: An application of the pG-DINA model. Educational Measurement and Evaluation Studies. 9(2), 109-134. [In Persian]
Mohammadi, N., Delavar, A., Farrokhi, N. A., & Minaei, A. (1396). Identifying the underlying attributes of the Wechsler Intelligence Scale for Children-IV items based on the narrow abilities of the Cattell-Horn-Carroll theory using the G-DINA cognitive diagnostic model. Educational Measurement. 7(2), 1-32. [In Persian]
Mafakheri, Sh., Shahvarani Semnani, A., Behzadi, M. H., & Barahmand, A. (1398). Presenting a structural model of the components affecting the learning of algebraic expressions in the new secondary school curriculum using the attribute hierarchy method. Educational Innovations. 18(2), 127-146. [In Persian]
Moghaddam, A., Falsafinejad, M. R., Farrokhi, N. A., & Estaji, M. (1395). Diagnostic analysis of the general English reading comprehension items of the Ph.D. entrance examination using the non-compensatory Fusion model. Educational Measurement. 6(4), 41-68. [In Persian]
Minaei, A., Delavar, A., Falsafinejad, M. R., Kiamanesh, A. R., & Mohajer, Y. (1393). Cognitive diagnostic modeling (CDM) of the TIMSS 2007 mathematics items for Iranian eighth-grade students using the reparameterized unified model (RUM) and a comparison of male and female students' mathematical skills. Educational Measurement. (2), 137-169. [In Persian]
Bradshaw, L., Izsák, A., Templin, J., & Jacobson, E. (2013). Diagnosing teachers' understandings of rational numbers: Building a multidimensional test within the diagnostic classification framework. Educational Measurement: Issues and Practice. 33(1), 2-14.
Chen, J., & de la Torre, J. (2014). A procedure for diagnostically modeling extant large-scale assessment data: The case of the Programme for International Student Assessment in reading. Psychology. 5(18), 1967-1978.
Choi, K. M., Lee, Y.-S., & Park, Y. S. (2015). What CDM can tell about what students have learned: An analysis of TIMSS eighth grade mathematics. Eurasia Journal of Mathematics, Science, & Technology Education. 11(6), 1563–1577.
Chiu, C.-Y. (2013). Statistical refinement of the Q-matrix in cognitive diagnosis. Applied Psychological Measurement. 37(8), 598–618.
Chiu, C., & Douglas, J. (2013). A nonparametric approach to cognitive diagnosis by proximity to ideal response patterns. Journal of Classification. 30(2), 225–250.
Cui, Y., Gierl, M. J., & Chang, H.-H. (2012). Estimating classification consistency and accuracy for cognitive diagnostic assessment. Journal of Educational Measurement. 49(1), 19–38.
 
DeCarlo, L. T. (2011). On the analysis of fraction subtraction data: The DINA model, classification, latent class sizes, and the Q-matrix. Applied Psychological Measurement. 35(1), 8-26.
 
De la Torre, J. (2008). An empirically based method of Q-matrix validation for the DINA model: Development and applications. Journal of Educational Measurement. 45(4), 343-362.
De la Torre, J., & Chiu, C. Y. (2016). General method of empirical Q-matrix validation. Psychometrika. 81(2), 253–273.
Effatpanah, F. (2019). Application of cognitive diagnostic models to the listening section of the International English Language Testing System (IELTS). International Journal of Language Testing. 9(1), 1-28.
Effatpanah, F., Baghaei, P., & Boori, A. A. (2019). Diagnosing EFL learners’ writing ability: A diagnostic classification modeling analysis. Language Testing in Asia. 9(12), 1-28.
Gierl, M. J., Wang, C., & Zhou, J. (2008). Using the attribute hierarchy method to make diagnostic inferences about examinees' cognitive skills in algebra on the SAT. Journal of Technology, Learning, and Assessment. 6(6).
Gorin, J. S. (2009). Diagnostic classification models: Are they necessary? Measurement: Interdisciplinary Research and Perspectives. 7(1), 30–33.
Grant, M. J., & Booth, A. (2009). A typology of reviews: An analysis of 14 review types and associated methodologies. Health Information and Libraries Journal. 26(2), 91-108.
Green, S. B., Lissitz, R. W., & Mulaik, S. (1977). Limitations of coefficient alpha as an index of test unidimensionality. Educational and Psychological Measurement. 37(4), 827-839.
Henson, R. A. (2009). Diagnostic classification models: Thoughts and future directions. Measurement: Interdisciplinary Research and Perspectives. 7(1), 34–36.
 
Jang, E. E. (2008). A framework for cognitive diagnostic assessment. In C. A. Chapelle, Y.-R. Chung, & J. Xu (Eds.), Towards adaptive CALL: Natural language processing for diagnostic language assessment (pp. 117‐131). Ames, IA: Iowa State University.
Jang, E. E. (2009). Cognitive diagnostic assessment of L2 reading comprehension ability: Validity arguments for Fusion Model application to LanguEdge assessment. Language Testing. 26(1), 31–73.
Javidanmehr, Z., & Anani Sarab, M. R. (2017). Cognitive diagnostic assessment: Issues and considerations. International Journal of Language Testing. 7(2), 73–98.
Javidanmehr, Z., & Anani Sarab, M. R. (2019). Retrofitting non-diagnostic reading comprehension assessment: application of the G-DINA model to a high stakes reading comprehension test. Language Assessment Quarterly. 16(3), 294-311.
Kabiri, M., Ghazi-Tabatabaei, M., Bazargan, A., Shokoohi-Yekta, M., & Kharrazi, K. (2016). Diagnosing competency mastery in science: An application of GDM to TIMSS 2011 data. Applied Measurement in Education. 30(1), 27-38.
Kunina-Habenicht, O., Rupp, A. A., & Wilhelm, O. (2009). A practical illustration of multidimensional diagnostic skills profiling: Comparing results from confirmatory factor analysis and diagnostic classification models. Studies in Educational Evaluation. 35(2), 64–70.
Leighton, J. P., & Gierl, M. J. (2007). Why cognitive diagnostic assessment. In J. P. Leighton, & M. J. Gierl (Eds.), Cognitive diagnostic assessment for education: Theory and applications (pp. 3-18). New York: Cambridge University Press.
Lei, P. W., & Li, H. (2016). Performance of fit indices in choosing correct cognitive diagnostic models and Q-matrices. Applied Psychological Measurement. 40(6), 405-417.
Li, H. (2016). Estimation of Q-matrix for DINA model using the constrained generalized DINA framework (Doctoral dissertation). Graduate School of Arts and Sciences, Columbia University.
Madison, M. J., & Bradshaw, L. P. (2014). The effects of Q-matrix design on classification accuracy in the log-linear cognitive diagnosis model. Educational and Psychological Measurement. 75(3), 491-511.
Ravand, H., Barati, H., & Widhiarso, W. (2013). Exploring diagnostic capacity of a high stakes reading comprehension test: A pedagogical demonstration. Iranian Journal of Language Testing. 3(1), 11-37.
Ravand, H., & Robitzsch, A. (2016). Cognitive diagnostic modeling using R. Practical Assessment, Research & Evaluation. 20(11), 1-12.
Ravand, H. (2016). Application of a cognitive diagnostic model to a high-stakes reading comprehension test. Journal of Psychoeducational Assessment. 34(8), 782–799.
Ravand, H., & Robitzsch, A. (2018). Cognitive diagnostic model of best choice: A study of reading comprehension. Educational Psychology. 38(10), 1255-1277.
Ravand, H., & Baghaei, P. (2020). Diagnostic classification models: Recent developments, practical issues, and prospects. International Journal of Testing. 20(1), 24-56.
Roussos, L., DiBello, L., Stout, W., Hartz, S., Henson, R., & Templin, J. (2007). The Fusion Model Skills Diagnosis System. In J. Leighton & M. Gierl (Eds.), Cognitive Diagnostic Assessment for Education: Theory and Applications (pp. 275-318). New York: Cambridge University Press. doi:10.1017/CBO9780511611186.010.
Rupp, A. A., & Templin, J. (2008a). Unique characteristics of cognitive diagnosis models: A comprehensive review of the current state-of-the-art. Measurement: Interdisciplinary Research and Perspectives. 6(4), 219-262.
Rupp, A., & Templin, J. (2008b). The effects of Q-matrix misspecification on parameter estimates and misclassification rates in the DINA model. Educational and Psychological Measurement. 68(1), 78–98.
Rupp, A. A., Templin, J., & Henson, R. A. (2010). Diagnostic measurement, theory, methods, and applications. New York: The Guilford Press.
Sessoms, J., & Henson, R. A. (2018). Applications of diagnostic classification models: A literature review and critical commentary. Measurement: Interdisciplinary Research and Perspectives. 16(1), 1-17.
Sinharay, S., & Haberman, S. J. (2009). How much can we reliably know about what examinees know? Measurement: Interdisciplinary Research and Perspectives. 7(1), 46–49.
Tatsuoka, K. K. (1990). Toward an integration of item-response theory and cognitive error diagnosis. In N. Frederiksen, R. Glaser, A. Lesgold, & M. G. Shafto (Eds.), Diagnostic monitoring of skill and knowledge acquisition (pp. 453-488). Hillsdale, NJ: Erlbaum.
Templin, J., & Bradshaw, L. (2013). Measuring the reliability of diagnostic classification model examinee estimates. Journal of Classification. 30(2), 251–275.
Yang, X., & Embretson, S. E. (2007). Construct validity and cognitive diagnostic assessment. In J. P. Leighton & M. J. Gierl (Eds.), Cognitive diagnostic assessment for education: Theory and applications (pp. 119–145). New York: Cambridge University Press. https://doi.org/10.1017/CBO9780511611186.005
 
 
Volume 12, Issue 2
September 2021
Pages 101-122
  • Receive Date: 27 September 2020
  • Revise Date: 08 February 2021
  • Accept Date: 20 February 2021
  • First Publish Date: 02 August 2021
  • Publish Date: 23 August 2021