Joint Program in Survey Methodology

Permanent URI for this community: http://hdl.handle.net/1903/2251

Search Results

  • Nonparticipation Issues Related to Passive Data Collection
    (2024) Breslin, Alexandra Marie Brown; Presser, Stanley; Antoun, Chris; Survey Methodology; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
    New passive data collection techniques on smartphones allow for the direct observation of a participant’s behavior and environment in place of self-reported information. However, such studies do not appeal to everyone, especially those with higher security concerns. The current work explores the mechanisms that affect a sample member’s decision to participate in a passive data collection, using three different online panels. The first study examines nonparticipation bias in a financial tracking study; it finds evidence of bias in the self-reported measures of financial behaviors and shows that prior experience with the research organization positively affects a sample member’s decision to participate. Studies two and three employ deception designs (i.e., the passive data collections were presented as real rather than hypothetical, but no data was passively collected) in which respondents received experimentally varied invitations to participate in a smartphone-based passive data collection. The second study varies the type of data requested and the study topic to better understand how these study components interact; the findings suggest that the type of data requested affects participation while the study topic does not. The second study also presented video messages to all sample members who chose not to participate. These videos asked the sample member to reconsider, varying whether they reiterated the data and security measures described in the initial invitation. The results suggest that offering a follow-up video increased participation. Finally, the third study experimentally varied the level of control the sample member would have over what data is shared with researchers during a passive data collection. The findings suggest that an offer of control may not increase participation in app-based passive data collection. Together, the three studies suggest that sample members are more likely to participate when they have prior experience with such a request and may be converted by a video message, but that the type of data requested greatly influences the decision to participate. Future work should replicate these studies with different requested data types and with samples not drawn from online panels.
  • Design and Effectiveness of Multimodal Definitions in Online Surveys
    (2020) Spiegelman, Maura; Conrad, Frederick G; Survey Methodology; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
    If survey respondents do not interpret a question as intended, they may, in effect, answer the wrong question, increasing the chances of inaccurate data. Researchers can bring respondents’ interpretations into alignment with what is intended by defining the terms that respondents might misunderstand. This dissertation explores strategies to increase response alignment with definitions in online surveys. In particular, I compare the impact of unimodal (either spoken or textual) and multimodal (both spoken and textual) definitions on question interpretation and, indirectly, response quality. These definitions can be further categorized as conventional or optimized for the mode in which they are presented: optimized textual definitions use fewer words than conventional ones, with key information made visually salient and easier for respondents to grasp, while optimized spoken definitions use a shorter, more colloquial speaking style. The effectiveness of conventional and optimized definitions is compared, as is the effectiveness of unimodal and multimodal definitions. Amazon MTurk workers were randomly assigned to one of six definition conditions in a 2x3 design: conventional or optimized definitions, presented in a spoken, textual, or multimodal (both spoken and textual) format. While responses under unimodal optimized and conventional definitions were similar, multimodal definitions, and particularly multimodal optimized definitions, produced responses that aligned more closely with the definitions. Although complementary information presented in different modes can increase comprehension and improve data quality, redundant or otherwise untailored multimodal information may not have the same positive effects. Although not all respondents complied with instructions to read and/or listen to the definitions, compliance rates and the effectiveness of multimodal presentation were high enough to show improvements in data quality, and the effectiveness of multimodal definitions increased when only compliant observations were considered. Multimodal communication in a typically visual medium (such as a web survey) may increase the time needed to complete a questionnaire, but respondents did not find the definitions burdensome or otherwise unsatisfactory. While further techniques could be used to increase respondent compliance with instructions, this study suggests that multimodal definitions, when thoughtfully designed, can improve data quality without negatively affecting respondents.