A. James Clark School of Engineering
Permanent URI for this community: http://hdl.handle.net/1903/1654
The collections in this community comprise faculty research works, as well as graduate theses and dissertations.
3 results
Search Results
Item
Trajectory Planning for Autonomous Vehicles Performing Information Gathering Tasks (2018)
Kuhlman, Michael Joseph; Gupta, Satyandra K; Mechanical Engineering; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)

This dissertation investigates mission scenarios for autonomous vehicles in which the objective is to gather information. These tasks include minimizing the uncertainty of a target's estimated location, generating coverage plans for an area, and persistent monitoring tasks such as generating informative patrol routes. Information gathering tasks cannot be solved with shortest-path planning algorithms, since the rewards are path-dependent. Further, in order to deploy such algorithms effectively in the real world, generated plans must safely avoid obstacles, account for motion uncertainty (e.g., due to swift currents), and respect constraints on the vehicle's dynamics such as maximum speed and acceleration. This work extends state-of-the-art information gathering algorithms by generating dynamically feasible trajectories for autonomous vehicles that exploit the environment to find higher-quality solutions, reducing mission costs. We also reduce mission risk without sacrificing the amount of information gathered. The focus of this dissertation is to solve three related information gathering tasks that require generating dynamically feasible trajectories for reliable plan execution.

When searching for targets, minimizing target location uncertainty with autonomous vehicles improves the effectiveness of ground relief crews. We investigate the use of mutual information for efficiently generating long-duration, multi-pass trajectories that minimize target location uncertainty in natural environments. We develop epsilon-admissible heuristics and use them in an epsilon-admissible Branch and Bound algorithm that gathers the most information.
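The epsilon-admissible pruning idea behind such a Branch and Bound search can be sketched in a few lines. This is a minimal illustration, not the dissertation's implementation: the graph, the path-dependent reward model, and the per-step reward bound `r_max` are hypothetical. The one idea carried over is that pruning against an incumbent inflated by a factor (1 + epsilon) trades a bounded loss in gathered information for a smaller search tree:

```python
import heapq

def eps_branch_and_bound(start, neighbors, reward, horizon, eps=0.1, r_max=1.0):
    """Best-first branch and bound over fixed-horizon paths.

    reward(path) is path-dependent (e.g. mutual information gained so far),
    and r_max bounds the reward obtainable at any single future step, so
    reward(path) + r_max * steps_left is an admissible upper bound.
    A node is pruned when its bound <= (1 + eps) * best value found so far:
    the epsilon-admissible relaxation of standard admissible pruning.
    """
    best_path, best_val = (start,), reward((start,))
    # Max-heap emulated with negated bounds on a min-heap.
    frontier = [(-(best_val + r_max * horizon), (start,))]
    while frontier:
        neg_bound, path = heapq.heappop(frontier)
        if -neg_bound <= (1 + eps) * best_val:
            continue  # epsilon-admissible pruning
        for nxt in neighbors(path[-1]):
            new_path = path + (nxt,)
            val = reward(new_path)
            if val > best_val:
                best_path, best_val = new_path, val
            steps_left = horizon - (len(new_path) - 1)
            if steps_left > 0:
                bound = val + r_max * steps_left
                if bound > (1 + eps) * best_val:
                    heapq.heappush(frontier, (-bound, new_path))
    return best_path, best_val
```

With eps = 0 this reduces to exact branch and bound; larger eps prunes more aggressively while guaranteeing the returned reward is within a factor 1/(1 + eps) of optimal.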
Next, we investigate coordination techniques for underwater vehicle teams conducting large-scale geospatial tasks such as adaptive sampling or coverage planning. It is advantageous to exploit ocean currents to increase endurance, which requires accounting for forecast uncertainty. We adapt Monte Carlo Tree Search and the Cross-Entropy Method to maximize path-dependent reward, and introduce an iterative greedy solver that outperforms state-of-the-art planners. Finally, we investigate persistent monitoring tasks such as sentry patrol routes and monitoring of harmful algal blooms in a littoral environment. Such an automated planner needs to generate collision-free coverage paths by moving waypoints to locations that both minimize path traversal costs and maximize the amount of information gathered along the path. We extend previous Lloyd's-algorithm-based methods by factoring in ocean currents, and introduce greedy methods that minimize mission risk while maximizing information gathered.

Item
Fundamental Limits in Multimedia Forensics and Anti-forensics (2015)
Chu, Xiaoyu; Liu, K. J. Ray; Electrical Engineering; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)

As the use of multimedia editing tools increases, people have begun to question the authenticity of multimedia content. This is especially a concern for authorities, such as law enforcement, news reporters, and governments, who constantly use multimedia evidence to make critical decisions. To verify the authenticity of multimedia content, many forensic techniques have been proposed to identify the processing history of the content under question. However, as new technologies emerge and more complicated scenarios are considered, forensic researchers have gradually come to recognize the limitations of multimedia forensics. Exploring these fundamental limits is the inevitable next step for the field.
In this dissertation, we propose several theoretical frameworks to study the fundamental limits in various forensic problems. Specifically, we begin by developing empirical forensic techniques to address the limitations of existing techniques arising from an emerging technology, compressive sensing. Then, we go one step further and explore the fundamental limits of forensic performance. Two types of forensic problems are examined. In operation forensics, we propose an information-theoretic framework and define forensicability as the maximum information that features contain about hypotheses of processing histories. Based on this framework, we determine the maximum number of JPEG compressions one can detect. In order forensics, an information-theoretic criterion is proposed to determine when we can and cannot detect the order of manipulation operations that have been applied to multimedia content. Additionally, we examine the fundamental tradeoffs in multimedia anti-forensics, where attacking techniques are developed by forgers to conceal manipulation fingerprints and confuse forensic investigations. In this field, we define concealability as the effectiveness of anti-forensic techniques in concealing manipulation fingerprints. Then, a tradeoff between concealability, rate, and distortion is proposed and characterized for compression anti-forensics, which provides valuable insight into how forgers may behave under their best strategy.

Item
Common Randomness Principles of Secrecy (2013)
Tyagi, Himanshu; Narayan, Prakash; Electrical Engineering; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)

This dissertation concerns the secure processing of distributed data by multiple terminals, using interactive public communication among themselves, in order to accomplish a given computational task.
In the setting of a probabilistic multiterminal source model in which several terminals observe correlated random signals, we analyze secure distributed data processing protocols that harness the correlation in the data. The specific tasks considered are: computing functions of the data under secrecy requirements; generating secretly shared bits with a minimal rate of public communication; and securely sharing bits in the presence of a querying eavesdropper. In studying these various secure distributed processing tasks, we adopt a unified approach that entails examining the form of the underlying common randomness (CR) that is generated at the terminals during distributed processing. We make the case that the exact form of established CR is linked inherently to the data processing task at hand, and that its characterization can lead to a structural understanding of the associated algorithms. An identification of the underlying CR and its decomposition into independent components, each with a different operational significance, is a recurring fundamental theme at the heart of all the proofs in this dissertation. In addition to leading to new theoretical insights, it brings out equivalences between seemingly unrelated problems. Another distinguishing feature of this work is that it considers interactive communication protocols. In fact, understanding the structure of such interactive communication is a key step in proving our results.

We make the following contributions. First, we propose a new information-theoretic formulation to study secure distributed computing using public communication. The parties observing distributed data are trusted, but an eavesdropper has access to the public communication network. We examine distributed communication protocols that allow the trusted parties to accomplish their required computation tasks while giving away negligible information about a specified portion of the data to an eavesdropper with access to the communication.
Our theoretical results provide necessary and sufficient conditions that characterize the feasibility of various secure computing tasks; in many cases of practical importance, these conditions take a simple form and can be verified easily. When secure computing is feasible, we propose new algorithms in special cases.

Next, we revisit the problem of generating shared secret keys (SKs). We investigate the minimum communication requirements for generating information-theoretically secure SKs of maximum rate from correlated observations using interactive public communication. In particular, our approach allows us to examine the role of interaction in such communication. On the one hand, we find that interaction is not needed when the observed correlated bits are symmetrically correlated; in this case, simple noninteractive protocols are the most efficient means of generating optimum-rate SKs. On the other hand, we illustrate that interactive protocols can require a strictly lower rate of overall communication than noninteractive protocols.

Finally, we consider the task of ensuring security against an eavesdropper who makes queries about a portion of the distributed data that the terminals share by communicating over a public network. We introduce an alternative notion of secrecy which requires rendering the task of a querying eavesdropper as onerous as possible. Our main contribution in this part is the development of a new technique for proving converse results for secrecy problems involving CR with interactive communication, which is then employed to obtain an upper bound on the maximum number of queries that can be inflicted on the eavesdropper for any CR and corresponding communication. Surprisingly, there is an equivalence between this notion of secrecy and that of information theoretic security, which leads to new theoretical results for SK generation; for instance, we prove a strong converse for the SK capacity.
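For context on the SK results above: in the classical two-terminal setting with i.i.d. correlated observations (X, Y) and unlimited public communication, the secret key capacity equals the mutual information I(X; Y) (Maurer 1993; Ahlswede and Csiszár 1993). The following small sketch estimates that quantity from joint samples; the function name and the plug-in estimator are illustrative, not taken from the dissertation:

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Empirical (plug-in) mutual information I(X;Y) in bits.

    For two terminals observing i.i.d. correlated pairs (x, y), I(X;Y)
    is the classical two-terminal secret-key capacity: the maximum rate
    at which information-theoretically secure key bits can be distilled
    using public communication.
    """
    n = len(pairs)
    pxy = Counter(pairs)                 # joint counts
    px = Counter(x for x, _ in pairs)    # marginal counts of X
    py = Counter(y for _, y in pairs)    # marginal counts of Y
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        # p_joint / (p_x * p_y) simplifies to c * n / (count_x * count_y)
        mi += p_joint * math.log2(p_joint * n * n / (px[x] * py[y]))
    return mi
```

Perfectly correlated bits give 1 bit of key per observation, while independent observations give none, matching the intuition that only shared correlation can be distilled into secret key.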
We conclude by hypothesizing the basic principles of secrecy generation that emerge from the results developed in this dissertation.