Validity and Reliability Assessment of the Persian Version of the Maryland Physics Expectations Survey: An Application of a Polytomous Item Response Model

Article Type: Research Article

Author

Assistant Professor, Department of Educational Sciences, Faculty of Educational Sciences and Psychology, Shahid Chamran University of Ahvaz, Ahvaz, Iran

Abstract

This study was conducted to assess the validity and reliability of the Maryland Physics Expectations Survey (MPEX), one of the most widely used instruments for measuring attitudes toward and expectations of physics courses. The research is applied, and its statistical population consists of Iranian students taking physics in high school. The sample comprised 423 upper-secondary students (197 girls and 226 boys) in the experimental sciences and mathematics tracks during the 1398-99 (2019-20) academic year. Confirmatory factor analysis was used to examine the factor structure, and the graded response model from item response theory was used to analyze the questionnaire items. The LISREL and IRTPro software packages were used for data analysis. The content validity of the scale was 0.93, and the reliability coefficients of the questionnaire factors ranged from 0.74 to 0.89. The factor analysis confirmed the constructs predicted in the original questionnaire, and most items fit the polytomous graded response model well while showing satisfactory discrimination and threshold parameters. The researcher recommends the Persian version of this questionnaire for investigating expectations and attitudes about physics courses.
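The graded response model referred to above describes each Likert-type item with a discrimination parameter and a set of ordered threshold parameters, and derives the probability of each response category from differences of cumulative logistic curves. The sketch below is a minimal NumPy illustration of that mapping; the parameter values are purely hypothetical and are not estimates from this study.

```python
import numpy as np

def grm_category_probs(theta, a, thresholds):
    """Graded response model: P(X = k | theta) for one polytomous item.

    theta      -- latent trait value(s) of the respondent(s)
    a          -- item discrimination parameter
    thresholds -- ordered threshold parameters b_1 < ... < b_{K-1}
    """
    theta = np.atleast_1d(np.asarray(theta, dtype=float))
    b = np.asarray(thresholds, dtype=float)
    # Cumulative probabilities P(X >= k | theta): one 2PL curve per threshold
    cum = 1.0 / (1.0 + np.exp(-a * (theta[:, None] - b[None, :])))
    # Pad with P(X >= lowest category) = 1 and P(X > highest category) = 0,
    # then take adjacent differences to obtain the category probabilities.
    pad = np.hstack([np.ones((theta.size, 1)), cum, np.zeros((theta.size, 1))])
    return pad[:, :-1] - pad[:, 1:]

# Hypothetical 5-point Likert item: discrimination 1.4, thresholds -1.5 ... 1.2
probs = grm_category_probs(theta=[-1.0, 0.0, 1.0], a=1.4,
                           thresholds=[-1.5, -0.5, 0.4, 1.2])
print(probs.round(3))  # one row per theta value; each row sums to 1
```

In practice the discrimination and threshold parameters are estimated from the observed response data, for example by marginal maximum likelihood as implemented in IRT software such as IRTPro.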



 

Keywords


Article Title [English]

A Validity and Reliability Assessment of the Persian Version of Maryland Physics Expectations Survey: Using a Polytomous Item Response Theory Model

Author [English]

  • Mojtaba Jahanifar
Department of Educational Sciences, Faculty of Educational Sciences and Psychology, Shahid Chamran University of Ahvaz, Ahvaz, Iran
Abstract [English]

This study was conducted to assess the validity and reliability of the Maryland Physics Expectations Survey (MPEX), one of the most widely used instruments for measuring attitudes and expectations in physics courses. This is applied research, and its statistical population consists of Iranian students who take physics in high school. The sample included 423 high school students (197 females and 226 males) from the experimental sciences and mathematics-physics fields of study during the 2019-20 academic year. Confirmatory factor analysis was conducted to examine the factor structure, while the graded response model from item response theory was used to analyze the questionnaire items. The LISREL and IRTPro software packages were used to analyze the data. The content validity of the scale was 0.93, and the reliability coefficient of each factor of the questionnaire was between 0.74 and 0.89. The factor analysis confirmed the structures predicted in the original questionnaire. In addition, most of the questionnaire items fit the polytomous graded response model well and had satisfactory discrimination and threshold parameters. The researcher recommends the use of the Persian version of this questionnaire to explore expectations and attitudes about physics.
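As a point of reference for the reliability coefficients reported above (0.74 to 0.89 per factor), a common choice for such a coefficient is Cronbach's alpha, computed from the item variances and the variance of the total score. The snippet below is a minimal illustration with fabricated Likert responses, not data from the study sample; the reported content validity of 0.93 is likewise typically obtained as a content validity index, i.e., the proportion of expert judges rating the items as relevant.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (respondents x items) matrix of item scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)       # variance of each item
    total_variance = scores.sum(axis=1).var(ddof=1)   # variance of the sum score
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

# Fabricated 5-point Likert responses: 5 respondents x 4 items of one factor
responses = np.array([[4, 5, 4, 3],
                      [2, 2, 3, 2],
                      [5, 4, 5, 5],
                      [3, 3, 2, 3],
                      [4, 4, 4, 5]])
print(round(cronbach_alpha(responses), 2))
```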

Keywords [English]

  • Physics Education
  • Physics Attitudes
  • MPEX Questionnaire
  • Graded Response Model
  • Validity Study
  • Reliability Study