ITEM-ANALYSIS METHODS AND THEIR IMPLICATIONS FOR THE ILTA GUIDELINES FOR PRACTICE: A COMPARISON OF THE EFFECTS OF CLASSICAL TEST THEORY AND ITEM RESPONSE THEORY MODELS ON THE OUTCOME OF A HIGH-STAKES ENTRANCE EXAM

dc.contributor.advisor: Ross, Steven J.
dc.contributor.author: Ellis, David P.
dc.contributor.department: Second Language Acquisition and Application
dc.contributor.publisher: Digital Repository at the University of Maryland
dc.contributor.publisher: University of Maryland (College Park, Md.)
dc.date.accessioned: 2012-02-17T07:07:40Z
dc.date.available: 2012-02-17T07:07:40Z
dc.date.issued: 2011
dc.description.abstract: The current version of the International Language Testing Association (ILTA) Guidelines for Practice requires language testers to pretest items before including them on an exam or, when pretesting is not possible, to conduct post-hoc item analysis to ensure that any malfunctioning items are excluded from scoring. However, the guidelines offer no guidance on which item-analysis method is appropriate for a given examination. The purpose of this study is to determine what influence the choice of item-analysis method has on the outcome of a high-stakes university entrance exam. Two types of classical test theory (CTT) item analysis and three item response theory (IRT) models were applied to responses from a single administration of a 70-item, dichotomously scored multiple-choice test of English proficiency taken by 2,320 examinees applying to a prestigious private university in western Japan. Results illustrate that the choice of item-analysis method greatly influences the ordinal ranking of examinees. The implications of these findings are discussed, and recommendations are made for revising the ILTA Guidelines for Practice to delineate more explicitly how language testers should apply item analysis in their testing practice.
dc.identifier.uri: http://hdl.handle.net/1903/12367
dc.subject.pqcontrolled: Educational tests & measurements
dc.subject.pqcontrolled: Foreign language instruction
dc.subject.pquncontrolled: CTT
dc.subject.pquncontrolled: foreign language
dc.subject.pquncontrolled: high-stakes
dc.subject.pquncontrolled: ILTA
dc.subject.pquncontrolled: IRT
dc.subject.pquncontrolled: item analysis
dc.title: ITEM-ANALYSIS METHODS AND THEIR IMPLICATIONS FOR THE ILTA GUIDELINES FOR PRACTICE: A COMPARISON OF THE EFFECTS OF CLASSICAL TEST THEORY AND ITEM RESPONSE THEORY MODELS ON THE OUTCOME OF A HIGH-STAKES ENTRANCE EXAM
dc.type: Dissertation
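
Note: The abstract refers to classical test theory (CTT) item analysis of dichotomously scored items. As a rough illustration only, not drawn from the dissertation itself, the sketch below computes the two most common CTT item statistics, item difficulty (proportion correct) and corrected item-total point-biserial discrimination, for a small 0/1 response matrix. The function name and toy data are hypothetical.

    import numpy as np

    def ctt_item_stats(responses):
        # responses: 2-D array of 0/1 item scores, shape (examinees, items)
        total = responses.sum(axis=1)            # raw total score per examinee
        difficulty = responses.mean(axis=0)      # p-value: proportion answering correctly
        n_items = responses.shape[1]
        discrimination = np.empty(n_items)
        for j in range(n_items):
            rest = total - responses[:, j]       # total score with the item itself removed
            discrimination[j] = np.corrcoef(responses[:, j], rest)[0, 1]  # point-biserial
        return difficulty, discrimination

    # Toy data: 5 examinees x 4 items (hypothetical, for illustration only)
    scores = np.array([[1, 0, 1, 1],
                       [1, 1, 1, 0],
                       [0, 0, 1, 0],
                       [1, 1, 1, 1],
                       [0, 0, 0, 0]])
    p, r_pb = ctt_item_stats(scores)
    print("difficulty:", p)
    print("discrimination:", r_pb)

An IRT analysis would instead estimate item parameters and examinee ability from a latent-trait model, which is why, as the abstract reports, the two families of methods can produce different ordinal rankings of examinees.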

Files

Original bundle
Name: Ellis_umd_0117E_12824.pdf
Size: 6.73 MB
Format: Adobe Portable Document Format