Theses and Dissertations from UMD

Permanent URI for this community: http://hdl.handle.net/1903/2

New submissions to the thesis/dissertation collections are added automatically as they are received from the Graduate School. Currently, the Graduate School deposits all theses and dissertations from a given semester after the official graduation date, so there may be up to a four-month delay before a given thesis/dissertation appears in DRUM.

More information is available at Theses and Dissertations at University of Maryland Libraries.

Search Results

Now showing 1 - 3 of 3
  • Career Barriers of College Women across Racial/Ethnic Groups: Examination of The Perception of Barriers Scale
    (2018) Kim, Young Hwa; O'Brien, Karen M.; Psychology; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
    The purpose of the study was to examine the factor structure, measurement invariance, and psychometric properties of a commonly used measure of perceived career barriers (the Perception of Barriers Scale; Luzzo & McWhirter, 2001) with a sample of racially diverse college women. The results supported a nine-factor structure of the Perception of Barriers Scale, indicating different sources of barriers. In general, configural, metric, and scalar invariance of the Perception of Barriers subscales was found across Asian American, African American, Latina American, and White American college women for the nine-factor structure. All three groups of women of color reported higher career barriers due to racial discrimination, higher educational barriers due to financial concerns, and higher educational barriers due to lack of confidence and skills than did White women. The results also demonstrated potential differences in salient barriers across Asian American, African American, and Latina American women. The reliability estimates were satisfactory, and construct validity was supported by negative associations between scores on several Perception of Barriers subscales and a career self-efficacy measure. The findings suggested that college women experience barriers from various sources when pursuing their career and educational goals.
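The configural/metric/scalar invariance tests mentioned in this abstract are typically carried out as chi-square difference tests between nested multigroup CFA models. As a minimal sketch of that comparison step (the fit statistics below are hypothetical, not taken from the study):

```python
from scipy.stats import chi2

def chi_square_difference(chisq_restricted, df_restricted,
                          chisq_free, df_free):
    """Chi-square difference test between two nested CFA models.

    The restricted model (e.g., metric invariance: loadings constrained
    equal across groups) is compared against the freer model (e.g.,
    configural invariance). A non-significant difference supports
    retaining the added equality constraints.
    """
    delta_chisq = chisq_restricted - chisq_free
    delta_df = df_restricted - df_free
    p_value = chi2.sf(delta_chisq, delta_df)  # upper-tail probability
    return delta_chisq, delta_df, p_value

# Hypothetical fit statistics for illustration only:
# configural model: chi-square = 512.4 on 432 df
# metric model:     chi-square = 540.1 on 456 df
d_chi, d_df, p = chi_square_difference(540.1, 456, 512.4, 432)
print(f"delta chi-square = {d_chi:.1f}, delta df = {d_df}, p = {p:.3f}")
```

A non-significant p-value here would support metric invariance, as reported across the four groups in the study.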
  • Beyond Response Rates: The Effect of Prepaid Incentives on Measurement Error
    (2012) Medway, Rebecca; Tourangeau, Roger; Survey Methodology; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
    As response rates continue to decline, survey researchers increasingly offer incentives as a way to motivate sample members to take part in their surveys. Extensive prior research demonstrates that prepaid incentives are an effective tool for doing so. If prepaid incentives influence behavior at the stage of deciding whether to participate, they may also alter the way that respondents behave while completing surveys. Nevertheless, most research has focused narrowly on the effect that incentives have on response rates. Survey researchers should have a better empirical basis for assessing the potential tradeoffs associated with the higher response rates yielded by prepaid incentives. This dissertation describes the results of three studies aimed at expanding our understanding of the impact of prepaid incentives on measurement error. The first study explored the effect that a $5 prepaid cash incentive had on twelve indicators of respondent effort in a national telephone survey. The incentive led to significant reductions in item nonresponse and interview length; however, it had little effect on the other indicators, such as response order effects and responses to open-ended items. The second study evaluated the effect that a $5 prepaid cash incentive had on responses to sensitive questions in a mail survey of registered voters. The incentive resulted in a significant increase in the proportion of highly undesirable attitudes and behaviors to which respondents admitted and had no effect on responses to less sensitive items. While the incentive led to a general pattern of reduced nonresponse bias and increased measurement bias for the three voting items for which administrative data were available for the full sample, these effects generally were not significant. The third study tested for measurement invariance in incentive and control group responses to four multi-item scales from three recent surveys that included prepaid incentive experiments. There was no evidence of differential item functioning; however, full metric invariance could not be established for one of the scales. Generally, these results suggest that prepaid incentives had minimal impact on measurement error. Thus, these findings should be reassuring for survey researchers considering the use of prepaid incentives to increase response rates.
  • Testing for Differentially Functioning Indicators Using Mixtures of Confirmatory Factor Analysis Models
    (2009) Mann, Heather Marie; Hancock, Gregory R; Measurement, Statistics and Evaluation; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
    Heterogeneity in measurement model parameters across known groups can be modeled and tested using multigroup confirmatory factor analysis (CFA). When it is not reasonable to assume that parameters are homogeneous for all observations in a manifest group, mixture CFA models are appropriate. Mixture CFA models can add theoretically important unmeasured characteristics to capture heterogeneity and have the potential to be used to test measurement invariance. The current study investigated the ability of mixture CFA models to identify differences in factor loadings across latent classes when there is no mean separation in either the latent or the measured variables. Using simulated data from models with known parameters, parameter recovery, classification accuracy, and the power of the likelihood-ratio test were evaluated as a function of model complexity, sample size, latent class proportions, magnitude of factor loading differences, percentage of noninvariant factor loadings, and pattern of noninvariant factor loadings. Results suggested that mixture CFA models may be a viable option for testing the invariance of measurement model parameters; however, in the absence of impact (latent mean differences) and differences in measurement intercepts, larger sample sizes, more noninvariant factor loadings, and larger amounts of heterogeneity are needed to distinguish different latent classes and successfully estimate their parameters.
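The classification accuracy evaluated in this abstract is driven by the posterior probabilities used to assign observations to latent classes. As a minimal illustration of that assignment step, here is Bayes' rule for a univariate two-class normal mixture (a deliberate simplification of a full mixture CFA; all parameter values are hypothetical, not from the study):

```python
import numpy as np
from scipy.stats import norm

def posterior_class_probs(x, weights, means, sds):
    """Posterior probability of latent class membership for a single
    observation in a normal mixture, via Bayes' rule:
    P(class k | x) is proportional to pi_k * f(x | mu_k, sigma_k)."""
    dens = np.array([w * norm.pdf(x, loc=m, scale=s)
                     for w, m, s in zip(weights, means, sds)])
    return dens / dens.sum()

# Hypothetical two-class mixture: class proportions .6/.4, a modest
# mean separation, unit variances in both classes
probs = posterior_class_probs(0.8, weights=[0.6, 0.4],
                              means=[0.0, 1.5], sds=[1.0, 1.0])
print(probs)  # the two posterior probabilities sum to 1
```

When mean separation is small, these posteriors stay close to the class proportions for most observations, which is consistent with the abstract's finding that larger samples and greater heterogeneity are needed for accurate classification.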