Human Development & Quantitative Methodology
Permanent URI for this community: http://hdl.handle.net/1903/2248
The departments within the College of Education were reorganized and renamed as of July 1, 2011. This department incorporates the former Department of Measurement, Statistics & Evaluation, the former Department of Human Development, and the Institute for Child Study.
Search Results (2 results)
Item: A MIXED-STRATEGIES RASCH TESTLET MODEL FOR LOW-STAKES TESTLET-BASED ASSESSMENTS (2013)
Chen, Ying-Fang; Jiao, Hong; Measurement, Statistics and Evaluation; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)

In low-stakes assessments, a lack of test-taking motivation inevitably occurs because test scores have no consequential effect on test takers' academic records. Commonly, some unmotivated test takers simply apply a random-guessing strategy rather than a solution strategy when taking a test. Testlet effects also arise because educational assessment items are frequently written in testlet units. A challenge to psychometric measurement is that conventional item response theory models do not sufficiently account for test-taking motivation heterogeneity or testlet effects. These sources of construct-irrelevant variance affect test validity, the accuracy of parameter estimates, and the targeted inferences. This study proposes a measurement model for low-stakes assessments that simultaneously accounts for test-taking motivation heterogeneity and testlet effects. The performance and effectiveness of the proposed model are evaluated through a simulation study, and its utility is demonstrated through an application to a real standardized low-stakes assessment dataset. Simulation results show that overlooking test-taking motivation heterogeneity and testlet effects adversely affected model-data fit and model parameter estimates. The proposed model improved model-data fit and classification accuracy and recovered model parameters well under test-taking motivation heterogeneity and testlet effects. In the real data application, the item response dataset, which was originally calibrated with the Rasch model, was fitted better by the proposed model, and both test-taking motivation heterogeneity and testlet effects were identified. Finally, a set of variables selected from the real dataset was used to explore potential factors that characterize the latent classes of test-taking motivation. In the science assessment, science proficiency was associated with test-taking motivation heterogeneity.

Item: A MIXTURE RASCH MODEL WITH A COVARIATE: A SIMULATION STUDY VIA BAYESIAN MARKOV CHAIN MONTE CARLO ESTIMATION (2009)
Dai, Yunyun; Mislevy, Robert J.; Measurement, Statistics and Evaluation; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)

Mixtures of item response theory models have been proposed as a technique to explore response patterns in test data related to cognitive strategies, instructional sensitivity, and differential item functioning (DIF). Estimation proves challenging due to difficulties in identification and questions about the effect size needed to recover the underlying structure. In particular, the impact of auxiliary variables, or covariates, for examinees on estimation has not been systematically explored. The goal of this dissertation is to carry out a systematically designed simulation study to investigate the performance of the mixture Rasch model (MRM) under Bayesian estimation using the Markov chain Monte Carlo (MCMC) method. The dependent variables in this study are (1) the proportion of cases in which the generating mixture structure is recovered, and (2) among those cases in which the structure is recovered, the bias and root mean squared error of the parameter estimates. The foci of the study are to use a flexible logistic regression model to parameterize the relation between latent class membership and the examinee covariate, to study MCMC estimation behavior in light of effect size, and to provide insights and suggestions on model application and model estimation.
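
For readers unfamiliar with the model class in the first item above, a minimal sketch of a mixed-strategies Rasch testlet structure is given below; the notation and the fixed guessing probability are illustrative assumptions, not taken from the dissertation itself. Examinees in the solution-strategy class respond according to a Rasch model with a person-specific testlet random effect, while examinees in the random-guessing class respond at chance level:

$$
P(X_{ij}=1 \mid g_i = \text{solution}) = \frac{\exp\!\big(\theta_i - b_j + \gamma_{i\,d(j)}\big)}{1 + \exp\!\big(\theta_i - b_j + \gamma_{i\,d(j)}\big)},
\qquad
P(X_{ij}=1 \mid g_i = \text{guessing}) = c_j,
$$

where $\theta_i$ is the ability of examinee $i$, $b_j$ is the difficulty of item $j$, $\gamma_{i\,d(j)}$ is the random effect for examinee $i$ on the testlet $d(j)$ containing item $j$, and $c_j$ is a fixed chance probability (for example, the reciprocal of the number of response options).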
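The mixture Rasch model with a covariate in the second item can be sketched along similar lines: a Rasch model with class-specific item difficulties, with latent class membership linked to an examinee covariate through a logistic (multinomial) regression. Again, this notation is an assumed illustration rather than the dissertation's own formulation:

$$
P(X_{ij}=1 \mid \theta_i, g_i = k) = \frac{\exp(\theta_i - b_{jk})}{1 + \exp(\theta_i - b_{jk})},
\qquad
P(g_i = k \mid z_i) = \frac{\exp(\beta_{0k} + \beta_{1k} z_i)}{\sum_{m=1}^{K} \exp(\beta_{0m} + \beta_{1m} z_i)},
$$

where $b_{jk}$ is the difficulty of item $j$ in latent class $k$, $z_i$ is the examinee covariate, and the $\beta$ terms are the class-specific regression coefficients (with one class's coefficients fixed, e.g., to zero, for identification).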