Evaluation of multiple-choice questions by item analysis, from an online internal assessment of 6th semester medical students in a rural medical college, West Bengal

Bibliographic Details
Main Authors: Sharmistha Bhattacherjee (Author), Abhijit Mukherjee (Author), Kallol Bhandari (Author), Arup Jyoti Rout (Author)
Format: Article
Published: Wolters Kluwer Medknow Publications, 2022-01-01T00:00:00Z.
Subjects:
Online Access: Connect to this object online.

MARC

LEADER 00000 am a22000003u 4500
001 doaj_96bd79e88a9047d2bfaaeed4d04cf3e7
042 |a dc 
100 1 0 |a Sharmistha Bhattacherjee  |e author 
700 1 0 |a Abhijit Mukherjee  |e author 
700 1 0 |a Kallol Bhandari  |e author 
700 1 0 |a Arup Jyoti Rout  |e author 
245 0 0 |a Evaluation of multiple-choice questions by item analysis, from an online internal assessment of 6th semester medical students in a rural medical college, West Bengal 
260 |b Wolters Kluwer Medknow Publications,   |c 2022-01-01T00:00:00Z. 
500 |a 0970-0218 
500 |a 1998-3581 
500 |a 10.4103/ijcm.ijcm_1156_21 
520 |a Background: Properly constructed single best-answer multiple-choice questions (MCQs), or items, assess higher-order cognitive processing in Bloom's taxonomy and accurately discriminate between high and low achievers. However, guidelines for writing good test items are rarely followed, leading to the generation and use of faulty MCQs. Materials and Methods: During the lockdown period in 2020, an internal assessment was conducted online using Google Forms. It consisted of 60 single-response MCQs, each with a single stem and four options: one correct answer and three distractors. Each item was analyzed for difficulty index (Dif I), discrimination index (DI), and distractor efficiency (DE). Results: The mean of the achieved marks was 42.92 (standard deviation [SD] 5.07). The mean Dif I, DI, and DE were 47.95% (SD 16.39), 0.12 (SD 0.10), and 18.42 (SD 15.35), respectively. 46.67% of the items were easy, and 21.66% showed acceptable discrimination. A very weak negative correlation was found between Dif I and DI. Of the 180 distractors in total, 51.66% were nonfunctional. Conclusion: Item analysis, and storage of MCQs together with their indices, gives an examiner the opportunity to select MCQs of an appropriate difficulty level as the assessment requires and to decide their placement in the question paper. 
546 |a EN 
690 |a bloom's taxonomy 
690 |a difficulty index 
690 |a discrimination index 
690 |a distractor efficiency 
690 |a item analysis 
690 |a multiple-choice questions 
690 |a Public aspects of medicine 
690 |a RA1-1270 
655 7 |a article  |2 local 
786 0 |n Indian Journal of Community Medicine, Vol 47, Iss 1, Pp 92-95 (2022) 
787 0 |n http://www.ijcm.org.in/article.asp?issn=0970-0218;year=2022;volume=47;issue=1;spage=92;epage=95;aulast=Bhattacherjee 
787 0 |n https://doaj.org/toc/0970-0218 
787 0 |n https://doaj.org/toc/1998-3581 
856 4 1 |u https://doaj.org/article/96bd79e88a9047d2bfaaeed4d04cf3e7  |z Connect to this object online.
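
Note on the indices named in the 520 abstract field above: difficulty index, discrimination index, and distractor efficiency are standard item-analysis statistics. The Python sketch below illustrates one conventional way to compute them for a single item; it is not the authors' code, and the 27% high/low achiever split, the <5% threshold for a non-functional distractor, the function name item_analysis, and the sample data are all assumptions made for illustration.

from collections import Counter


def item_analysis(responses, correct_option, total_scores, options=("A", "B", "C", "D")):
    """Return (difficulty index %, discrimination index, distractor efficiency %)
    for one item. `responses` holds each examinee's chosen option and is aligned
    with `total_scores`, each examinee's overall test score."""
    n = len(responses)

    # Rank examinees by total score; compare the top and bottom 27% groups.
    order = sorted(range(n), key=lambda i: total_scores[i], reverse=True)
    k = max(1, round(0.27 * n))
    high, low = order[:k], order[-k:]
    h_correct = sum(responses[i] == correct_option for i in high)
    l_correct = sum(responses[i] == correct_option for i in low)

    dif_i = (h_correct + l_correct) / (2 * k) * 100   # % correct across the two groups
    di = (h_correct - l_correct) / k                  # ranges from -1 to +1

    # A distractor picked by fewer than 5% of all examinees is treated as
    # non-functional; efficiency drops by one third per non-functional distractor.
    counts = Counter(responses)
    distractors = [o for o in options if o != correct_option]
    nfd = sum(counts.get(o, 0) < 0.05 * n for o in distractors)
    de = (len(distractors) - nfd) / len(distractors) * 100

    return dif_i, di, de


# Hypothetical data for one four-option item answered by 10 examinees:
choices = ["A", "B", "A", "A", "C", "A", "D", "A", "B", "A"]
scores = [52, 31, 48, 55, 28, 44, 25, 50, 35, 41]
print(item_analysis(choices, "A", scores))   # (50.0, 1.0, 100.0)

With a real dataset, running this per item and storing the three indices alongside each MCQ is what lets an examiner pick questions of a chosen difficulty and discrimination, as the conclusion of the abstract suggests.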