Theses and Dissertations from UMD

Permanent URI for this community: http://hdl.handle.net/1903/2

New submissions to the thesis/dissertation collections are added automatically as they are received from the Graduate School. Currently, the Graduate School deposits all theses and dissertations from a given semester after the official graduation date. This means there may be up to a four-month delay before a given thesis/dissertation appears in DRUM.

More information is available at Theses and Dissertations at University of Maryland Libraries.

Search Results

Now showing 1 - 2 of 2
  • Item
    Enhancing the Understanding of the Relationship between Social Integration and Nonresponse in Household Surveys
    (2015) Amaya, Ashley Elaine; Presser, Stanley; Survey Methodology; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
    Nonresponse and nonresponse bias remain fundamental concerns for survey researchers, as understanding them is critical to producing accurate statistics. This dissertation tests the relationship between social integration, nonresponse, and nonresponse bias. Using the rich frame information available on the American Time Use Survey (ATUS) and the Survey of Health, Ageing, and Retirement in Europe (SHARE) Wave II, structural equation models were employed to create latent indicators of social integration. The resulting variables were used to predict nonresponse and its components (e.g., noncontact). In both surveys, social integration was significantly predictive of nonresponse (regardless of type of nonresponse), with integrated individuals more likely to respond. However, the relationship was driven by different components of integration across the two surveys. Full-sample estimates were compared to respondent estimates on a series of 40 dichotomous and categorical variables to test the hypothesis that variables measuring social activities and roles would suffer from nonresponse bias. The impact of nonresponse on multivariate models predicting social outcomes was also evaluated. Nearly all of the 40 assessed variables suffered from significant nonresponse bias, resulting in the overestimation of social activity and role participation. In general, civic and political variables suffered from higher levels of bias, but the differences were not significant. Multivariate models were not exempt; beta coefficients were frequently biased, although the direction was inconsistent and the magnitude often small. Finally, an indicator of social integration was added to the weighting methodology with the goal of eliminating the observed nonresponse bias. While the addition significantly reduced the bias in most instances compared to both the base- and traditionally-weighted estimates, the improvements were small and did little to eliminate the bias.
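
The weighting adjustment this abstract describes — folding an integration indicator into the nonresponse weights — can be illustrated with a minimal weighting-class sketch. All data below are synthetic, and the two-level "integration" indicator is a hypothetical stand-in for the dissertation's latent measures, not its actual methodology.

```python
# Sketch: weighting-class nonresponse adjustment. Each respondent's base
# weight is inflated by the inverse of the response rate in its class
# (here, a hypothetical high/low social-integration indicator).

def adjust_weights(records):
    """Return respondents with nonresponse-adjusted weights."""
    # Response rate per integration class, computed over the full sample
    totals, responded = {}, {}
    for r in records:
        c = r["integration"]
        totals[c] = totals.get(c, 0) + 1
        if r["responded"]:
            responded[c] = responded.get(c, 0) + 1
    rates = {c: responded.get(c, 0) / totals[c] for c in totals}
    # Adjusted weight = base weight / class response rate
    return [
        {**r, "weight": r["base_weight"] / rates[r["integration"]]}
        for r in records if r["responded"]
    ]

# Synthetic frame: high-integration cases respond at 80%, low at 50%
sample = (
    [{"integration": "high", "responded": True,  "base_weight": 1.0}] * 8 +
    [{"integration": "high", "responded": False, "base_weight": 1.0}] * 2 +
    [{"integration": "low",  "responded": True,  "base_weight": 1.0}] * 5 +
    [{"integration": "low",  "responded": False, "base_weight": 1.0}] * 5
)
adjusted = adjust_weights(sample)
# High-integration respondents get weight 1/0.8 = 1.25; low-integration
# respondents get 1/0.5 = 2.0, so each class's weights again sum to its
# full-sample size of 10, restoring the balance between the two groups.
```

The point of the sketch is the abstract's finding in miniature: because low-integration cases respond less often, unadjusted respondent totals overrepresent the highly integrated, and the class-level adjustment re-inflates the underrepresented group.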
  • Item
    NONPARTICIPATION OF THE 12TH GRADERS IN THE NATIONAL ASSESSMENT OF EDUCATIONAL PROGRESS: UNDERSTANDING DETERMINANTS OF NONRESPONSE AND ASSESSING THE IMPACT ON NAEP ESTIMATES OF NONRESPONSE BIAS ACCORDING TO PROPENSITY MODELS
    (2009) Chun, Young I.; Abraham, Katharine; Robinson, John; Sociology; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
    This dissertation examines nonparticipation of 12th graders in the year 2000 National Assessment of Educational Progress (NAEP), using a model of nonresponse developed by Groves and Couper (1998). NAEP is a continuing assessment of American students' knowledge in various subject areas, including mathematics and science, and the possibility that its results could be contaminated by a low response rate was considered a serious concern. The dissertation evaluates the statistical impact of nonparticipation bias on estimates of educational performance in NAEP by applying response propensity models to the NAEP mathematics and science survey data and the corresponding school administrative data from over 20,000 seniors in the 2000 High School Transcript Study (HSTS). When NAEP and HSTS are merged, one has measures of individual- and school-level characteristics for nonparticipants as well as participants. Results indicate that nonresponse was not a serious contaminant: applying response-propensity-based weights led to only about a 1-point difference, on average, on scales of 500 points in mathematics and 300 points in science. The results support other recent research (e.g., Curtin, Presser, and Singer, 2000; Groves, 2006) showing that lowered response rates have minimal effects on nonresponse bias.
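
The response-propensity weighting this abstract applies can be sketched in miniature: fit a model of the probability of responding given observed characteristics, then weight each respondent by the inverse of its fitted propensity. The covariate, scores, and response pattern below are synthetic illustrations, not NAEP/HSTS values, and the simple gradient-descent logistic fit stands in for the dissertation's actual propensity models.

```python
import math

def fit_logistic(xs, ys, steps=5000, lr=0.5):
    """Fit P(respond) = sigmoid(b0 + b1*x) by batch gradient ascent
    on the log-likelihood."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p          # gradient w.r.t. intercept
            g1 += (y - p) * x    # gradient w.r.t. slope
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

# Synthetic full sample: a binary covariate x (e.g., some school-level
# characteristic) and a score on a 500-point-style scale; true mean = 500.
full = [(1, 520)] * 10 + [(0, 480)] * 10
# Response depends on x (8/10 respond when x=1, 4/10 when x=0), so the
# respondent pool overrepresents the higher-scoring x=1 group.
responded = [1] * 8 + [0] * 2 + [1] * 4 + [0] * 6

b0, b1 = fit_logistic([x for x, _ in full], responded)

def propensity(x):
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

resp = [(x, s) for (x, s), r in zip(full, responded) if r]
naive = sum(s for _, s in resp) / len(resp)   # biased upward (~506.7)
weighted = (sum(s / propensity(x) for x, s in resp)
            / sum(1 / propensity(x) for x, _ in resp))  # near 500
```

Under this setup the inverse-propensity weights pull the respondent mean back toward the full-sample mean; the abstract's finding is that in the actual NAEP data the analogous correction moved estimates by only about a point, i.e., the observed nonresponse carried little bias.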