
Differential Functioning of the Items and Distractors of a Critical Thinking Test for Undergraduate Students by Gender

Kawthar Al-Abd Al-Qader and Yousef Al-Sawalmeh

Yarmouk University, Jordan.

DOI: 10.47015/17.2.6

JJES, 17(2), 2021, 251-269

Abstract: The present study aimed to examine the differential functioning of the items and distractors of a critical thinking test at the university level according to student gender. The test consisted of 49 multiple-choice items, each with four alternatives, covering the critical-thinking skills of analysis, interpretation, evaluation, explanation, inference, statistical and probabilistic reasoning, identifying stated and unstated assumptions, and focusing. To achieve the aim of the study, the responses of a sample of 930 male and female students, selected by cluster random sampling from Yarmouk University and Al-Balqa Applied University, were analyzed, and the Mantel-Haenszel odds-ratio method was used to detect differential item and distractor functioning. The results indicated 8 items with differential functioning, 6 of them favoring males; the size of the differential functioning was small in 4 items, moderate in 3 items, and large in one item. The results also indicated 11 distractors with differential functioning, 7 of them favoring males; the size of the differential functioning was large in 2 distractors and moderate in 9 distractors.

(Keywords: differential item functioning, differential distractor functioning, critical thinking test, bias, Mantel-Haenszel method)


 

Differential Item and Distractor Functioning in a Critical Thinking Test among University Students According to Gender 

Kawthar Al-Abd Al-Qader and Yousef Al-Sawalmeh, Yarmouk University, Jordan.

 

Abstract: This study aimed to examine gender-related differential item and distractor functioning in a critical thinking test among university students. The test consisted of 49 four-alternative multiple-choice items covering the critical-thinking skills of analysis, interpretation, evaluation, explanation, reasoning, statistical and probabilistic reasoning, identification of assumptions, and concentration. The responses of a cluster random sample of 930 male and female students from Yarmouk University and Al-Balqa Applied University were analyzed, and the Mantel-Haenszel odds-ratio method was used to examine differential item and distractor functioning. The results indicated 8 items with differential functioning, 6 of them in favor of males; the size of the differential functioning was small in 4 items, moderate in 3 items, and large in one item. There were also 11 distractors with differential functioning, 7 of them in favor of males; the size of the differential functioning was large in 2 distractors and moderate in 9 distractors.

(Keywords: Differential Item Functioning, Differential Distractor Functioning, Critical Thinking Test, Bias, Mantel-Haenszel)
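The Mantel-Haenszel odds-ratio method named in the abstract can be illustrated with a short sketch. The item counts below are hypothetical, and the ETS delta-scale cut-offs used to label DIF size (negligible/moderate/large) are a common convention, not necessarily the exact thresholds the study applied:

```python
import math

def mantel_haenszel_odds_ratio(strata):
    """Mantel-Haenszel common odds-ratio estimate across score strata.
    Each stratum is (A, B, C, D): A/B = reference group (e.g., males)
    correct/incorrect, C/D = focal group (e.g., females) correct/incorrect."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    return num / den

def ets_delta(alpha):
    """Transform the odds ratio to the ETS delta scale (MH D-DIF)."""
    return -2.35 * math.log(alpha)

def dif_size(delta):
    """Common ETS convention: |D| < 1 negligible, 1 <= |D| < 1.5 moderate,
    |D| >= 1.5 large."""
    d = abs(delta)
    if d < 1.0:
        return "negligible"
    if d < 1.5:
        return "moderate"
    return "large"

# Hypothetical counts for one item, stratified by total test score.
strata = [(40, 10, 30, 20), (25, 25, 20, 30), (15, 35, 10, 40)]
alpha = mantel_haenszel_odds_ratio(strata)
size = dif_size(ets_delta(alpha))
```

An odds ratio above 1 indicates the item favors the reference group among examinees matched on total score; the delta transform simply rescales this for the size classification.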

 

References

Ajlouni, J. (2016). Examining the differential distractor functioning of a math test in the Jordanian national assessment of knowledge economy by sex and school location. Ph.D. dissertation, Yarmouk University, Irbid, Jordan.

Aladaileh, S. (1996). The influence of sex and educational level on confidence in learning mathematics and its relationship to academic achievement in mathematics for sixth and tenth grades in the governmental schools of Al-Karak governorate. Master's thesis, Mutah University, Al-Karak, Jordan.

Al Karki, W., & Mahadeen, S. (2019). Critical thinking level among Mutah University students and its relationship with cognitive motivation. Dirasat: Educational Sciences, 46(1), 321-342.

Anamezie, R., & Nnadi, F. (2018). Evaluation of differential distractor functioning of a physics achievement battery for quality assurance using a multinomial log-linear model. International Journal of Modern Management Sciences, 7(1), 28-39.

Autawi, E. (2004). Detecting gender-related differential functioning of the eighth-grade basic general science exam items in the fourth Amman education directorate. Master's thesis, Amman Arab University, Amman, Jordan.

Banks, K. (2004). Exploring racial differences in items that differ in cultural characteristics through differential bundle and distractor functioning. (Order No. 3138573, The University of Wisconsin-Milwaukee). ProQuest Dissertations and Theses, 91. Retrieved from: http://search.proquest.com/docview/305111981?accountid=48928. (305111981).

Bowell, T. (2017). Response to the editorial education in a post-truth world. Educational Philosophy and Theory, 49(6), 582-585.

Brookfield, S. (1997). Assessing critical thinking. New Directions for Adult & Continuing Education, 75, 17-29.

Camilli, G., & Shepard, L. (1994). Methods for identifying biased test items. Thousand Oaks, CA: SAGE Publications.

Dewey, J. (1960). The quest for certainty (A. Al-Ahwani, Trans.). Eissa Al-Babi Al-Halabi Press.

Ellis, B., & Raju, S. (2003). Test and item bias: What they are, what they aren't and how to detect them. In: Measuring up: Assessment issues for teachers, counselors and administrators. (ERIC Document Reproduction Service No. ED480042).

Erikson, G., & Erikson, M. (2019). Learning outcomes and critical thinking: Good intentions in conflict. Studies in Higher Education, 44(12), 2293-2303.

Evers, A., Muñiz, J., Hagemeister, C., Høstmælingen, A., Lindley, P., Sjöberg, A., & Bartram, D. (2013). Assessing the quality of tests: Revision of the EFPA review model. Psicothema, 25(3), 283-291.

Garner, M., & Engelhard Jr., G. (1999). Gender differences in performance on multiple-choice and constructed response mathematics items. Applied Measurement in Education, 12(1), 29-51.

Ghadia, I., Abu Bakar, K., Alwia, N., & Taliba, O. (2013). Gender analysis of critical thinking disposition instrument among University Putra Malaysia undergraduate students. Recent Technological Advances in Education, 27-33. Retrieved on March 31, 2020 from: http://www.wseas.us/e-library/conferences/2013/Malaysia/EDUETE/EDUETE-03.pdf

Gomez-Benito, J., Sireci, S., Padilla, J., Hidalgo, M., & Benitez, I. (2018). Differential item functioning: Beyond validity evidence based on internal structure. Psicothema, 30(1), 104-109.

Greenberg, A. (2010). Fighting bias with statistics: Detecting gender differences in responses to items on a preschool science assessment. (Order No. 3424789, University of Miami). ProQuest Dissertations and Theses, 104. Retrieved from: http://search.proquest.com/docview/756923372?accountid=48928.

Haladyna, T. (2004). Developing and validating multiple-choice test items. Mahwah, NJ: Lawrence Erlbaum Associates.

Havnes, A., & Prøitz, S. (2016). Why using learning outcomes in higher education? Exploring the grounds for academic resistance and reclaiming the value of unexpected learning. Educational Assessment, Evaluation and Accountability, 28(3), 205-223.

Innabi, H., & Dodeen, H. (2006). Content analysis of gender-related differential item functioning in TIMSS items in mathematics in Jordan. School Science and Mathematics, 106(8), 147-189.

International Conference on Education Evaluation. (2018). Future skills: Development and assessment. December 4-6, 2018, Riyadh.

Jalabi, S. (2005). The basics of building psychological and educational tests and standards. Aladdin Publishing and Distribution House.

Jaradat, A. (2003). A comparison between the Mantel-Haenszel method and the item-difficulty method for detecting differential item functioning. Unpublished master's thesis, Mutah University, Al-Karak.

Karakaya, I. (2012). An investigation of item bias in science & technology subtests and mathematics subtests in level determination exam (LDE). Educational Sciences: Theory & Practice, 12(1), 222-229.

Koon, S. (2010). A comparison of methods for detecting differential distractor functioning. (Order No. 3415232, The Florida State University). ProQuest Dissertations and Theses, 93. Retrieved from: http://search.proquest.com/docview/734610226?

Leach, B. (2011). Critical thinking skills as related to university students' gender and academic discipline. Dissertation, East Tennessee State University.

Macfarlane, B. (2017). Freedom to learn: The threat to student academic freedom and why it needs to be reclaimed. London: Routledge.

Mahmoud, F. (2010). Differential functioning of the items and distractors of a grade-6 science reference test designed according to item response theory. Journal of the Open University of Jerusalem for Research and Studies, (44), 123-135.

Martinkova, P., Drabinova, A., Liaw, Y., Sanders, E., McFarland, J., & Price, R. (2017). Checking equity: Why differential item functioning analysis should be a routine part of developing conceptual assessments. CBE-Life Sciences Education, 16(2).

McWhorter, K.T., & Collins, H. (1992). Study & thinking skills in college (2nd edn.). Authors.

Middleton, K., & Laitusis, C. (2007). Examining test items for differential distractor functioning among students with learning disabilities. Educational Testing Service Research Report RR-07-43. Retrieved on February 9, 2020 from: www.ets.org/Media/Research/pdf/RR-07-43.pdf

Mubarak, W. (2010). Differential item functioning in the science test of the PISA 2006 international study. Ph.D. dissertation, Yarmouk University, Irbid, Jordan.

Oudeh, A. (2010). Measurement and evaluation in the pedagogical process (4th ed.). Dar Al-Amal.

Paul, R. (1998). Critical thinking: Placing it at the heart of ethics instruction. Journal of Development Education, 22(2), 36-38.

Penfield, R. D., & Camilli, G. (2007). Differential item functioning and item bias. In: S. Sinharay, & C. R. Rao (Eds.), Handbook of statistics, Vol. 26: Psychometrics (pp. 125-167). Elsevier.

Penfield, R. D. (2008). An odds ratio approach for assessing differential distractor functioning effects under the nominal response model. Journal of Educational Measurement, 45(3), 247-269.

Penfield, R. (2010). DDFS: Differential distractor functioning software. Applied Psychological Measurement, 34(8), 646-647.

Petress, K. (2004). Critical thinking: An extended definition. Education, 124(3), 461-466.