Theses and Dissertations from UMD

Permanent URI for this community: http://hdl.handle.net/1903/2

New submissions to the thesis/dissertation collections are added automatically as they are received from the Graduate School. Currently, the Graduate School deposits all theses and dissertations from a given semester after the official graduation date. This means that there may be up to a four-month delay before a given thesis/dissertation appears in DRUM.

More information is available at Theses and Dissertations at University of Maryland Libraries.

Search Results

Now showing 1 - 8 of 8
  • Item
    ESSAYS ON DIGITAL ECONOMICS
    (2024) Kim, Sueyoul; Jin, Ginger Z; Economics; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
This dissertation studies economic questions in the digital environment. Specifically, it examines whether the design of a seller reward program on a livestreaming platform is optimal from the platform's revenue perspective, and how consumers' privacy concerns affect their behavior. In the first chapter, I present an empirical framework for assessing the impact of seller reward programs on platform revenue. The context is a Korean livestreaming platform, where sellers (called streamers) broadcast content and receive tips from viewers to generate revenue. Platform revenue comes from commissions charged on these tips, and the reward is a permanent commission discount awarded through performance-based monthly tournaments. I first collect individual streamer-time level data, including effort (measured by streaming hours), tipping revenue, and reward program acceptance. The collected data, along with anecdotal evidence, indicate that streamers are heterogeneous in profitability, measured by tipping revenue per watch time. Furthermore, they tend to compete within specific broadcasting categories (e.g., within the Game category) to attract viewers. I then estimate a dynamic model to describe the effect of program design on streamers' behavior. The key trade-off for the livestreaming platform is that offering more commission discount rewards may increase total tipping revenue by encouraging streamers, especially more profitable ones, to stream more, but it leaves the platform with a substantially smaller share of that revenue. Counterfactual simulations reveal that the latter effect, the platform's reduced share, quantitatively dominates. This suggests that scaling back the reward program, by granting the reward to fewer streamers or by decreasing the commission discount rate, would raise platform revenue.
Additionally, these simulations identify opportunities to raise platform revenue by reallocating approval slots more granularly, at the level of individual broadcasting categories rather than the platform as a whole. In the second chapter, I empirically study how consumers' privacy concerns affect their behavior. Using panel survey data from South Korea that followed 5,328 individuals for four years, I find that privacy concern has a significant negative effect on Facebook and Twitter usage. I additionally find that such concern has heterogeneous effects on online shopping behavior, while cloud storage services remain unaffected. When privacy-related events such as the 2018 Facebook-Cambridge Analytica data scandal increase privacy concern, the harm appears to fall not only on Facebook but also on other firms in the industry (e.g., Twitter). Because a private firm does not internalize such negative spillovers, the privacy protection level determined in a free market could differ from the social optimum.
  • Item
    Analysis of Data Security Vulnerabilities in Deep Learning
    (2022) Fowl, Liam; Czaja, Wojciech; Goldstein, Thomas; Mathematics; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
    As deep learning systems become more integrated into important application areas, the security of such systems becomes a paramount concern. Specifically, as modern networks require an increasing amount of data on which to train, the security of data that is collected for these models cannot be guaranteed. In this work, we investigate several security vulnerabilities and security applications of the data pipeline for deep learning systems. We systematically evaluate the risks and mechanisms of data security from multiple perspectives, ranging from users to large companies and third parties, and reveal several security mechanisms and vulnerabilities that are of interest to machine learning practitioners.
  • Item
    USERS’ PERCEPTIONS OF DATA OWNERSHIP, DATA STORAGE, AND THEIR LOCUS OF CONTROL OVER DATA GENERATED BY SMART PHONE APPLICATIONS
    (2020) Rogers, Lisa; Mazurek, Michelle; Library & Information Services; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
    When users don’t understand how their data is stored, managed, and deleted in the cloud, it can leave that data vulnerable to hacking, privacy breaches, and other losses. How aware are users of what data they have in their cloud or other locations? This thesis examines how centralized remote storage affects participants’ knowledge of and ability to control and delete their phone app data using qualitative semi-structured interviews with 16 adults in the Washington, DC area. Results indicate that many users, especially Android users, don’t know what data they have backed up, and don’t feel they have control or understanding of their cloud account. Some participants thought they could better control their data if they learned more technical skills, but felt too intimidated to try. These results have implications for designing more usable cloud storage, recovery and deletion for mobile devices.
  • Item
    Security and Trust in Mobile Ad-Hoc Networks
    (2015) Jain, Shalabh; Baras, John S; Electrical Engineering; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
Distributed ad-hoc networks have become ubiquitous in the current technological framework. Such networks have widespread applications in commercial, civil, and military domains. Systems utilizing these networks are deployed in scenarios influencing critical aspects of human lives, e.g., vehicular networks for road safety, infrastructure monitoring for the smart grid or wildlife, and healthcare systems. The pervasive nature of such systems has made them a valuable target for adversarial action. The risk is compounded by the fact that the networks are typically composed of low-power, unattended devices with limited protection and processing capabilities. Cryptographic primitives can impose significant overhead in these scenarios. Further, behavioral aspects of participants, which are critical for distributed system operation, are not effectively addressed by cryptography. In this dissertation, we explore the direction of using notions of trust and privacy to address security in these networks. In the first part of the dissertation, we consider the problems of generation, distribution, and utilization of trust metrics. We adopt a cross-layer, component-based view of the network protocols. We propose schemes operating at the physical layer of the communication stack to generate trust metrics. We demonstrate that these schemes reliably detect relay adversaries in networks and can serve as an effective measure of trust for the neighborhood discovery component. We propose techniques to combine trust from different detectors across multiple layers into a single trust metric. Further, we illustrate via simulations the advantages and disadvantages of existing techniques for propagating local trust metrics throughout the network. We propose modifications to increase the robustness of the semiring-based framework for trust propagation. Finally, we consider the utilization of trust metrics to increase the resilience of network protocols.
We propose a distributed trust-based framework to secure routing protocols such as AODV and DSR. We highlight the utility of our framework using the proposed point-to-point link trust metrics. In the second part of the dissertation, we focus on the role of privacy in ad-hoc networks. We demonstrate that for three broad categories of systems, distributed state estimation, distributed consensus, and distributed monitoring, privacy of context can reduce cryptographic requirements (such as the need for encryption). In fact, efficient methods to preserve privacy can significantly reduce the energy footprint of the overall security component. We define a privacy framework applicable to these scenarios, where the network can be partitioned into a hierarchical structure of critical and non-critical components. We utilize a physical-layer watermarking scheme to ensure privacy guarantees in our framework. Further, for systems that lack a natural hierarchical structure, such as information fusion systems, we define an efficient framework to construct a hierarchy (network partition) without leaking the structure to the adversary.
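For readers unfamiliar with the semiring view of trust propagation mentioned in this abstract, the general idea (from the trust-evaluation literature, not code from this dissertation) can be sketched as follows: link trusts along a path combine with the semiring "multiplication" and alternative paths combine with the semiring "addition". The (min, max) operator pair, the graph, and the trust values below are all illustrative assumptions, not the dissertation's actual metrics.

```python
# Hedged sketch of semiring-style trust propagation. One common instantiation
# uses min along a path (a weakest-link rule) and max across paths (take the
# most trusted route). Operators and values here are illustrative assumptions.

def path_trust(links):
    """Trust of a single path: the weakest link along it (semiring 'times')."""
    return min(links)

def node_trust(paths):
    """Trust in a target: best trust over all available paths (semiring 'plus')."""
    return max(path_trust(p) for p in paths)

# Two hypothetical paths from a source to a target, per-link trust in [0, 1]:
paths = [
    [0.9, 0.8, 0.7],  # three-hop path; weakest link 0.7
    [0.6, 0.95],      # two-hop path; weakest link 0.6
]
print(node_trust(paths))  # 0.7 -- the longer path wins because its weakest link is stronger
```

Note how the choice of operators encodes policy: with (min, max), adding hops never increases a path's trust, while a more optimistic semiring (e.g., product along paths) would penalize long paths more aggressively.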
  • Item
    New Notions and Mechanisms for Statistical Privacy
    (2014) Groce, Adam Dowlin; Katz, Jonathan; Computer Science; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
    Many large databases of personal information currently exist in the hands of corporations, nonprofits, and governments. The data in these databases could be used to answer any number of important questions, aiding in everything from basic research to day-to-day corporate decision-making. These questions must be answered while respecting the privacy of the individuals whose data are being used. However, even defining privacy in this setting can be difficult. The standard definition in the field is differential privacy. During the years since its introduction, a wide variety of query algorithms have been found that can achieve meaningful utility while at the same time protecting the privacy of individuals. However, differential privacy is a very strong definition, and in some settings it can seem too strong. Given the difficulties involved in getting differentially private output to all desirable queries, many have looked for ways to weaken differential privacy without losing its meaningful privacy guarantees. Here we discuss two such weakenings. The first is computational differential privacy, originally defined by Mironov et al. We find the promise of this weakening to be limited. We show two results that severely curtail the potential for computationally private mechanisms to add any utility over those that achieve standard differential privacy when working in the standard setting with all data held by a single entity. We then propose our own weakening, coupled-worlds privacy. This definition is meant to capture the cases where reasonable bounds can be placed on the adversary's certainty about the data (or, equivalently, the adversary's auxiliary information). We discuss the motivation for the definition, its relationship to other definitions in the literature, and its useful properties. 
Coupled-worlds privacy is actually a framework with which specific definitions can be instantiated, and we discuss a particular instantiation, distributional differential privacy, which we believe is of particular interest. Having introduced this definition, we then seek new distributionally differentially private query algorithms that can release useful information without the need to add noise, as is necessary when satisfying differential privacy. We show that one can release a variety of query outputs with distributional differential privacy, including histograms, sums, and least-squares regression lines.
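As background for the abstract's remark about "adding noise" under standard differential privacy, the canonical approach is the Laplace mechanism: a query with sensitivity s answered with Laplace noise of scale s/ε satisfies ε-differential privacy. The sketch below is textbook material, not code from this dissertation, and the query, sensitivity, and ε values are illustrative assumptions.

```python
import math
import random

# Hedged sketch of the standard Laplace mechanism for differential privacy.
# Noise of scale sensitivity/epsilon is added to the true answer, so each
# individual's presence changes the output distribution by at most e^epsilon.

def laplace_noise(scale, rng):
    """Sample Laplace(0, scale) by inverse-CDF from a uniform draw."""
    u = rng.random() - 0.5  # uniform on [-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def laplace_mechanism(true_answer, sensitivity, epsilon, rng):
    """Release a noisy answer satisfying epsilon-differential privacy."""
    return true_answer + laplace_noise(sensitivity / epsilon, rng)

rng = random.Random(0)
# A counting query (sensitivity 1) over a hypothetical database, at epsilon = 0.5:
noisy_count = laplace_mechanism(true_answer=42, sensitivity=1.0, epsilon=0.5, rng=rng)
```

The noise scale grows as ε shrinks, which is exactly the utility cost that distributional differential privacy, under its added distributional assumptions, can sometimes avoid.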
  • Item
    Respondent Consent to Use Administrative Data
    (2012) Fulton, Jenna Anne; Presser, Stanley; Survey Methodology; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
Surveys increasingly request respondents' consent to link survey responses with administrative records. Such linked data can enhance the utility of both the survey and administrative data, yet in most cases, this linkage is contingent upon respondents' consent. With evidence of declining consent rates, there is a growing need to understand factors associated with consent to record linkage. This dissertation presents the results of three research studies that investigate such factors. In the first study, we draw upon surveys conducted in the U.S. with consent requests to describe characteristics of surveys containing such requests, examine trends in consent rates over time, and evaluate the effects of several characteristics of the survey and consent request on consent rates. The results of this study suggest that consent rates are declining over time, and that some characteristics of the survey and consent request are associated with variations in consent rates, including survey mode, administrative record topic, personal identifier requested, and whether the consent request takes an explicit or opt-out approach. In the second study, we administered a telephone survey that experimentally examined the effect of administrative record topic on consent rates and, through non-experimental methods, investigated the influence of respondents' privacy, confidentiality, and trust attitudes and of consent request salience on consent rates. The results of this study indicate that respondents' confidentiality attitudes are related to their consent decision; the other factors examined appear to have less of an impact on consent rates in this survey. The final study used data from the 2009 National Immunization Survey (NIS) to assess the effects of interviewers and interviewer characteristics on respondents' willingness to consent to vaccination provider contact.
The results of this study suggest that interviewers vary in their ability to obtain respondents' consent, and that some interviewer characteristics are related to consent rates, including gender and amount of previous experience on the NIS.
  • Item
    Sharing Private Data Over Public Networks
    (2012) Baden, Randy; Bhattacharjee, Bobby; Computer Science; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
    Users share their sensitive personal data with each other through public services and applications provided by third parties. Users trust application providers with their private data since they want access to provided services. However, trusting third parties with private data can be risky: providers profit by sharing that data with others regardless of the user's desires and may fail to provide the security necessary to prevent data leaks. Though users may choose between service providers, in many cases no service providers provide the desired service without being granted access to user data. Users must make a choice: forego privacy or be denied service. I demonstrate that fine-grained user privacy policies and rich services and applications are not irreconcilable. I provide technical solutions to privacy problems that protect user data using cryptography while still allowing services to operate on that data. I do this primarily through content-agnostic references to data items and user-controlled pseudonymity. I support two classes of social networking applications without trusting third parties with private data: applications which do not require data contents to provide a service, and applications that deal with data where the only private information is the binding of the data to an identity. Together, these classes of applications encompass a broad range of social networking applications.
  • Item
    Enhancing Privacy in Cryptographic Protocols
    (2009) Shin, Ji Sun; Gligor, Virgil D; Shankar, Udaya; Computer Science; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
For the past three decades, a wide variety of cryptographic protocols have been proposed to solve secure communication problems even in the presence of adversaries. The range of this work varies from developing basic security primitives providing confidentiality and authenticity to solving more complex, application-specific problems. However, when these protocols are deployed in practice, a significant challenge is to ensure not just security but also privacy throughout these protocols' lifetime. As computer-based devices are more widely used and the Internet is more globally accessible, new types of applications and new types of privacy threats are being introduced. In addition, user privacy (or equivalently, key privacy) is more likely to be jeopardized in large-scale distributed applications because the absence of a central authority complicates control over these applications. In this dissertation, we consider three relevant cryptographic protocols facing user privacy threats when deployed in practice. First, we consider matchmaking protocols among strangers to enhance their privacy by introducing the "durability" and "perfect forward privacy" properties. Second, we illustrate the fragility of formal definitions with respect to password privacy in the context of password-based authenticated key exchange (PAKE). In particular, we show that PAKE protocols provably meeting the existing formal definitions do not achieve the expected level of password privacy when deployed in the real world. We propose a new definition for PAKE that is tightly connected to what is actually desired in practice and suggest guidelines for realizing this definition. Finally, we answer a specific privacy question, namely whether privacy properties of symmetric-key encryption schemes obtained by non-tight reduction proofs are retained in the real world.
In particular, we use the privacy notion of the "multi-key hiding" property and show its non-tight relation to the IND$-CPA property of symmetric-key schemes. We use the experimental results of Gligor et al. to show how a real attack breaks the "multi-key hiding" property of IND$-CPA symmetric-key encryption schemes with high probability in practice. Finally, we identify schemes that satisfy "multi-key hiding" and enhance key privacy in the real world.