Computer Science Theses and Dissertations

Permanent URI for this collection: http://hdl.handle.net/1903/2756

Search Results

Now showing 1 - 3 of 3
  • Item
    New Notions and Mechanisms for Statistical Privacy
    (2014) Groce, Adam Dowlin; Katz, Jonathan; Computer Science; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
    Many large databases of personal information currently exist in the hands of corporations, nonprofits, and governments. The data in these databases could be used to answer any number of important questions, aiding in everything from basic research to day-to-day corporate decision-making. These questions must be answered while respecting the privacy of the individuals whose data are being used. However, even defining privacy in this setting can be difficult. The standard definition in the field is differential privacy. In the years since its introduction, a wide variety of query algorithms have been found that achieve meaningful utility while protecting the privacy of individuals. However, differential privacy is a very strong definition, and in some settings it can seem too strong. Given the difficulty of producing differentially private output for all desirable queries, many have looked for ways to weaken differential privacy without losing its meaningful privacy guarantees. Here we discuss two such weakenings. The first is computational differential privacy, originally defined by Mironov et al. We find the promise of this weakening to be limited. We show two results that severely curtail the potential for computationally private mechanisms to add any utility over those that achieve standard differential privacy in the standard setting, where all data are held by a single entity. We then propose our own weakening, coupled-worlds privacy. This definition is meant to capture cases where reasonable bounds can be placed on the adversary's certainty about the data (or, equivalently, on the adversary's auxiliary information). We discuss the motivation for the definition, its relationship to other definitions in the literature, and its useful properties. Coupled-worlds privacy is in fact a framework from which specific definitions can be instantiated, and we discuss one particular instantiation, distributional differential privacy, which we believe is of particular interest. Having introduced this definition, we then seek new distributionally differentially private query algorithms that can release useful information without adding noise, as is necessary under differential privacy. We show that a variety of query outputs can be released with distributional differential privacy, including histograms, sums, and least-squares regression lines.
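    The differential privacy baseline referenced in this abstract is typically met by adding calibrated noise to query answers. Below is a minimal Python sketch of the standard Laplace mechanism for a counting query (a textbook construction, not code from the dissertation); the database, predicate, and epsilon value are illustrative. Under distributional differential privacy, by contrast, the thesis argues such noise can sometimes be avoided entirely.

        import random

        def laplace_noise(scale):
            # A Laplace(0, scale) sample is the difference of two
            # independent exponential samples, each with mean `scale`.
            return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

        def dp_count(records, predicate, epsilon):
            # Adding or removing one record changes a count by at most 1
            # (sensitivity 1), so Laplace noise with scale 1/epsilon gives
            # epsilon-differential privacy for the released count.
            true_count = sum(1 for r in records if predicate(r))
            return true_count + laplace_noise(1.0 / epsilon)

        # Illustrative use: a noisy count of records with age >= 40.
        db = [{"age": 34}, {"age": 52}, {"age": 47}]
        print(dp_count(db, lambda r: r["age"] >= 40, epsilon=0.5))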
  • Item
    Sharing Private Data Over Public Networks
    (2012) Baden, Randy; Bhattacharjee, Bobby; Computer Science; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
    Users share their sensitive personal data with each other through public services and applications provided by third parties. Users trust application providers with their private data because they want access to the provided services. However, trusting third parties with private data can be risky: providers can profit by sharing that data with others regardless of the user's wishes, and they may fail to provide the security necessary to prevent data leaks. Though users may choose between service providers, in many cases no provider offers the desired service without being granted access to user data. Users must make a choice: forgo privacy or be denied service. I demonstrate that fine-grained user privacy policies and rich services and applications are not irreconcilable. I provide technical solutions to privacy problems that protect user data using cryptography while still allowing services to operate on that data. I do this primarily through content-agnostic references to data items and user-controlled pseudonymity. I support two classes of social networking applications without trusting third parties with private data: applications that do not require data contents to provide a service, and applications that deal with data where the only private information is the binding of the data to an identity. Together, these classes encompass a broad range of social networking applications.
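    As a sketch of the general idea behind content-agnostic references (an illustration of the concept only, not the dissertation's actual protocol), the Python below uses the Fernet API from the `cryptography` package: the user encrypts locally, the provider stores only ciphertext under an opaque identifier, and the key is shared out of band, so the service can store and route the item without ever seeing its contents.

        # pip install cryptography
        import hashlib
        from cryptography.fernet import Fernet

        # The user encrypts locally; only ciphertext reaches the provider.
        key = Fernet.generate_key()
        ciphertext = Fernet(key).encrypt(b"my private status update")

        # The provider indexes the item by a content-agnostic reference:
        # an opaque identifier derived from the ciphertext that reveals
        # nothing about the plaintext.
        reference = hashlib.sha256(ciphertext).hexdigest()
        server_store = {reference: ciphertext}  # stand-in for the third party

        # A friend given (reference, key) out of band can fetch and
        # decrypt; the provider never holds the key.
        print(Fernet(key).decrypt(server_store[reference]).decode())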
  • Item
    Enhancing Privacy in Cryptographic Protocols
    (2009) Shin, Ji Sun; Gligor, Virgil D.; Shankar, Udaya; Computer Science; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
    For the past three decades, a wide variety of cryptographic protocols have been proposed to solve secure communication problems even in the presence of adversaries. This work ranges from developing basic security primitives providing confidentiality and authenticity to solving more complex, application-specific problems. However, when these protocols are deployed in practice, a significant challenge is to ensure not just security but also privacy throughout the protocols' lifetime. As computer-based devices become more widely used and the Internet more globally accessible, new types of applications and new types of privacy threats are being introduced. In addition, user privacy (or, equivalently, key privacy) is more likely to be jeopardized in large-scale distributed applications, because the absence of a central authority complicates control over these applications. In this dissertation, we consider three relevant cryptographic protocols facing user privacy threats when deployed in practice. First, we consider matchmaking protocols among strangers and enhance their privacy by introducing the "durability" and "perfect forward privacy" properties. Second, we illustrate the fragility of formal definitions with respect to password privacy in the context of password-based authenticated key exchange (PAKE). In particular, we show that PAKE protocols provably meeting the existing formal definitions do not achieve the expected level of password privacy when deployed in the real world. We propose a new definition for PAKE that is tightly connected to what is actually desired in practice and suggest guidelines for realizing this definition. Finally, we answer a specific privacy question: whether the privacy properties of symmetric-key encryption schemes obtained through non-tight reduction proofs are retained in the real world. In particular, we use the "multi-key hiding" privacy notion and show its non-tight relation to the IND$-CPA property of symmetric-key schemes. We use the experimental result of Gligor et al. to show how a real attack breaks the "multi-key hiding" property of IND$-CPA symmetric-key encryption schemes with high probability in practice. Lastly, we identify schemes that satisfy the "multi-key hiding" property and enhance key privacy in the real world.
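    To make the key-privacy concern concrete, here is a hypothetical Python illustration (not the Gligor et al. experiment or any construction from the dissertation) of how trial decryption can link a ciphertext to its key when encryption is authenticated but not key-hiding. It uses the Fernet API from the `cryptography` package: decryption rejects wrong keys, which is exactly what lets an adversary holding candidate keys identify the one that was used.

        # pip install cryptography
        from cryptography.fernet import Fernet, InvalidToken

        # Two users encrypt under different keys; an eavesdropper holds
        # both candidate keys (e.g., derived from a small dictionary).
        key_a, key_b = Fernet.generate_key(), Fernet.generate_key()
        token = Fernet(key_b).encrypt(b"hello")

        # Authenticated decryption verifies an internal MAC, so a wrong
        # key is rejected and the right key succeeds; the ciphertext is
        # thereby linked to its key and key privacy is lost.
        for label, k in (("key_a", key_a), ("key_b", key_b)):
            try:
                Fernet(k).decrypt(token)
                print("ciphertext links to", label)
            except InvalidToken:
                pass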