Example: Item analysis
Below is a sample item analysis showing a summary table of statistics for all items (questions) on a multiple-choice classroom exam. Review the item difficulty (P), discrimination (R(IT)), and distractors (options B-E).
Item analysis (sample of 10 items) – correct answer is “A”
- Which item(s) could you remove altogether from the exam? Why?
- Which distractor(s) would you revise? Why?
- Which items are working well?
By examining item difficulty, items 2, 4, and 5 appear extremely easy (P-value > .90), and items 8 and 9 appear extremely difficult (P-value < .20).
By examining item discrimination, items 4, 6, 8, 9, and possibly 5 appear to be poorly discriminating items (R(IT) values near or below zero). A near-zero or negative R(IT) means that students who did poorly on the exam overall answered these questions correctly about as often as, or more often than, students who did well overall.
Therefore, combining the results of these two investigations, items 4, 5, 8, and 9 appear to be the best candidates for deletion from this exam. Please review the item-deletion process before deleting items from your own exam.
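The two statistics used above can be computed directly from scored responses. The sketch below is a minimal, hypothetical implementation (the data and function names are illustrative, not from the exam above): P is the proportion of students answering an item correctly, and R(IT) is taken here as the item-rest correlation, i.e. the point-biserial correlation between an item's 0/1 score and the total score on the remaining items.

```python
import numpy as np

def item_stats(scores):
    """Return per-item difficulty (P) and discrimination (R(IT)).

    scores: a 0/1 matrix, rows = students, columns = items.
    P is the proportion correct per item; R(IT) is the correlation
    between each item and the total score on the *other* items
    (excluding the item itself avoids inflating the correlation).
    """
    scores = np.asarray(scores, dtype=float)
    p = scores.mean(axis=0)                       # difficulty per item
    totals = scores.sum(axis=1)                   # each student's total
    r_it = []
    for j in range(scores.shape[1]):
        item = scores[:, j]
        rest = totals - item                      # total minus item j
        if item.std() == 0 or rest.std() == 0:
            r_it.append(0.0)                      # no variance: undefined, report 0
        else:
            r_it.append(np.corrcoef(item, rest)[0, 1])
    return p, np.array(r_it)

# Hypothetical data: 4 students (best to worst) on 3 items.
p, r = item_stats([[1, 1, 1],
                   [1, 1, 0],
                   [1, 0, 0],
                   [0, 0, 0]])
```

With this toy matrix, item 1 (P = .75) is the easiest and item 3 (P = .25) the hardest; all three items discriminate positively, since higher-scoring students answer them more often.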
- Items 2, 4, and 5 appear to have distractors selected by few or no students, and items 8 and 9 appear to have distractors selected by as many or more students than the correct answer. These distractors should either be removed or revised, especially if the distractors (rather than the item itself) are the source of students' confusion.
- Items that appear to be working well (i.e., good difficulty, discrimination, and distractors) are items 1, 2, 3, and possibly 7 and 10, depending upon the purpose of the exam.
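Distractor review like the one in the bullets above starts from a simple tally of how many students chose each option. A minimal sketch, assuming responses are recorded as option letters (the function name and sample responses are hypothetical):

```python
from collections import Counter

def distractor_counts(responses, options="ABCDE"):
    """Tally how many students chose each option for a single item.

    A healthy distractor attracts at least some (mostly low-scoring)
    students; an option chosen by nobody adds nothing to the item,
    and a distractor chosen as often as or more than the key may
    signal a confusing stem or a miskeyed answer.
    """
    counts = Counter(responses)
    # Include zero counts so unused distractors are visible.
    return {opt: counts.get(opt, 0) for opt in options}

# Hypothetical responses for one item (correct answer "A"):
tally = distractor_counts(["A", "A", "B", "A", "E"])
# tally -> {'A': 3, 'B': 1, 'C': 0, 'D': 0, 'E': 1}
```

Here options C and D drew no students at all, which is the pattern described for items 2, 4, and 5: such distractors are candidates for removal or revision.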