UMD Theses and Dissertations

Permanent URI for this collection: http://hdl.handle.net/1903/3

New submissions to the thesis/dissertation collections are added automatically as they are received from the Graduate School. Currently, the Graduate School deposits all theses and dissertations from a given semester after the official graduation date. This means that there may be up to a four-month delay in the appearance of a given thesis/dissertation in DRUM.

More information is available at Theses and Dissertations at University of Maryland Libraries.


Search Results

Now showing 1 - 4 of 4
  • Item
    Geometric and Topological Reconstruction
    (2022) Rawson, Michael G.; Balan, Radu; Robinson, Michael; Mathematics; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
    The understanding of mathematical signals is responsible for the information age. Computation, communication, and storage by computers all use signals, either implicitly or explicitly, and use mathematics to manipulate those signals. Reconstruction of a particular signal can be desirable or even necessary depending on how the signal manifests and is measured. We explore how to use mathematical ideas to manipulate and represent signals. Given measurements, samples, or data, we analyze how to produce, or reconstruct, the desired signal, and the fundamental limits in doing so. We focus on reconstruction through a geometric and topological lens so that we can leverage geometric and topological constraints to solve the problems. As inaccuracies and noise are present in every computation, we adopt a statistical outlook and prove results that hold with high probability in the presence of noise. We begin with probability and statistics and then apply them to active reconstruction, where the probability signal must be estimated statistically by sampling various sources. We prove optimal ways of doing this even in the most challenging situations. Then we discuss functional analysis and how to reconstruct sparse rank-one decompositions of operators. We prove optimality of certain matrix classes, based on geometry, and compute the worst case via sampling distributions. With the mathematical tools of functional analysis, we introduce the optimal transportation problem. We then use the Wasserstein metric and its geometry to provably reconstruct sparse signals with added noise. We devise an algorithm to solve this optimization problem and confirm its ability on both simulated and real data. Heavily under-sampled data can be ill-posed, as is often the case with magnetic resonance imaging data. We leverage the geometry of the motion-correction problem to devise an appropriate approximation with a bound, then implement the method and confirm it in simulation and on real data.
Topology constraints are often present in non-obvious ways but can often be detected with persistent homology. We introduce the barcode algorithm and devise a method to parallelize it, allowing large datasets to be analyzed. We prove the parallelization speedup and use it for natural language processing, applying topology constraints to reconstruct word-sense signals. Persistent homology depends on the data manifold, if one exists, and on that manifold's reach. We calculate the manifold reach and prove the instability of this formulation. We introduce the combinatorial reach to generalize reach, prove that the combinatorial reach is stable, and confirm this in simulation. Unfortunately, reach and persistent homology are not invariants of hypergraphs. We discuss hypergraphs and how they can partially reconstruct joint distributions. We define a hypergraph and prove its ability to distinguish certain joint distributions. We give an approximation and prove its convergence, then confirm our results in simulation and demonstrate their usefulness on a real dataset.
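The barcode computation mentioned in this abstract can be illustrated with a minimal sketch of the standard sequential algorithm: column reduction of a boundary matrix over Z/2, where each reduced column pairs the birth of a topological feature with its death. This toy version is illustrative only; it is not the parallelized algorithm developed in the dissertation.

```python
# Minimal sketch of the standard persistence barcode algorithm:
# left-to-right column reduction of a Z/2 boundary matrix.
# Columns are represented as sets of row indices (nonzero entries).

def reduce_boundary(columns):
    """Reduce columns left to right; addition is symmetric difference (mod 2)."""
    low_to_col = {}  # lowest nonzero row index -> column that owns it
    reduced = [set(c) for c in columns]
    for j in range(len(reduced)):
        while reduced[j]:
            low = max(reduced[j])
            if low not in low_to_col:
                low_to_col[low] = j
                break
            # Cancel the shared lowest entry by adding the owning column mod 2.
            reduced[j] ^= reduced[low_to_col[low]]
    return reduced

def barcodes(columns):
    """Finite birth/death pairs: a nonzero reduced column j kills simplex max(col)."""
    reduced = reduce_boundary(columns)
    return [(max(col), j) for j, col in enumerate(reduced) if col]

# Toy filtration: three vertices (empty boundaries), then the three edges
# of a triangle. Edges 3 and 4 kill vertices 1 and 2; edge 5 creates a cycle.
print(barcodes([set(), set(), set(), {0, 1}, {1, 2}, {0, 2}]))
```

The parallelization studied in the dissertation distributes this reduction; the sequential version above is the baseline being sped up.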
  • Item
    Development of a CAD Model Simplification Framework for Finite Element Analysis
    (2012) Russ, Brian Henry; Gupta, Satyandra K; Mechanical Engineering; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
    Analyzing complex 3D models using finite element analysis software requires suppressing features/parts that are unlikely to influence the analysis results; doing so can significantly improve computational performance, both in mesh size and in mesh quality. The suppression step often depends on the context and application. Currently, most analysts perform this step manually. On a complex model, this step can take a long time and is tedious in nature. The goal of this thesis was to create a simplification framework for both part and assembly CAD models in preparation for finite element analysis. At the part level, a rule-based approach for suppressing holes, rounds, and chamfers is presented. At the assembly level, a tool for suppressing multiple specified part models at once is described. After discussing the frameworks, the tools are demonstrated on several different models to show the complete approach and its computational performance. The work presented in this thesis is expected to significantly reduce the manual, time-consuming activities within the model simplification stage by suppressing multiple features/parts at once, compared to the industry standard of suppressing one feature/part at a time. A simplified model speeds up the overall analysis, reducing the meshing time and the calculation of the analysis values, while maintaining, and on occasion improving, the quality of the analysis.
  • Item
    The Electronic Works of György Ligeti and their Influence on his Later Style
    (2006-05-02) Levy, Benjamin Robert; DeLio, Thomas; Music; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
    This dissertation, entitled The Electronic Works of György Ligeti and Their Influence on his Later Style, investigates the connections between the composer's pieces for electronic tape written in the late 1950s and the instrumental music he composed thereafter. There are numerous reasons to suspect such a chain of influence, including suggestive comments Ligeti has made in interviews. Moreover, these works, Glissandi (1957), Artikulation (1958), and the uncompleted Pièce électronique no. 3 (1957-58), were written at a critical point in the composer's career, falling between two major stylistic periods. Before he fled Hungary in December 1956, his compositions were influenced by Bartók, but his orchestral pieces Apparitions (1958-59) and Atmosphères (1961) were much celebrated for their strikingly original textures and timbres. While these orchestral pieces secured Ligeti's reputation as an important avant-garde figure, the first works he composed in the West were the electronic pieces, which have suffered relative neglect. There are difficulties inherent in analyzing electronic music, and thus the first chapter of this dissertation focuses on theoretical literature in this growing field, including discussion of musical timbre, different means of notation, and, in particular, the work of theorist Robert Cogan. Chapters 2 and 3 are analytical studies of Ligeti's finished tape pieces, using spectrographs and information from Ligeti's sketches to focus on the use of sonic material in the construction of form. Additionally, each study is put in the context of Ligeti's contemporaries, composers such as Karlheinz Stockhausen and Gottfried Michael Koenig, as well as figures such as the philosopher T.W. Adorno. The fourth and final chapter focuses on the historical chain of influence and examines some of Ligeti's instrumental music, particularly Apparitions, in light of its electronic precedents.
These examples illuminate connections between the electronic and instrumental works, ranging from the slightest nuances in individual gestures, many of which are translated directly from one medium to the other, to methods of constructing entire forms, which continue to appear throughout Ligeti's oeuvre; thus, the final aim of this dissertation is to provide groundwork for further studies that will deepen the understanding of other works by this innovative composer.
  • Item
    Dwarf: A Complete System for Analyzing High-Dimensional Data Sets
    (2004-08-23) Sismanis, John; Roussopoulos, Nick; Computer Science; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
    The need for data analysis in different industries, including telecommunications, retail, manufacturing, and financial services, has generated a flurry of research, highly sophisticated methods, and commercial products. However, all of the current attempts are haunted by the so-called "high-dimensionality curse": space and time complexity increase exponentially with the number of analysis "dimensions". This means that all existing approaches are limited to coarse levels of analysis and/or to approximate answers with reduced precision. As the need for detailed analysis keeps increasing, along with the volume and the detail of the data that is stored, these approaches are very quickly rendered unusable. I have developed a unique method for efficiently performing analysis that is not affected by the high dimensionality of the data and scales only polynomially, and almost linearly, with the dimensions, without sacrificing any accuracy in the returned results. I have implemented a complete system (called "Dwarf") and performed an extensive experimental evaluation that demonstrated tremendous improvements over existing methods for all aspects of analysis: initial computation, storage, querying, and updating. I have extended my research to the "data-streaming" model, where updates are performed on-line, which complicates any concurrent analysis but has a very high impact on applications like security, network management/monitoring, router traffic control, and sensor networks. I have devised streaming algorithms that provide complex statistics within user-specified relative-error bounds over a data stream. I introduced the class of "distinct implicated statistics", which is much more general than the established class of "distinct count" statistics. The latter has proven invaluable in applications such as analyzing and monitoring the distinct count of species in a population, or even in query optimization.
The "distinct implicated statistics" class provides invaluable information about the correlations in the stream and is necessary for applications such as security. My algorithms are designed to use bounded amounts of memory and processing -so that they can even be implemented in hardware for resource-limited environments such as network-routers or sensors- and also to work in "noisy" environments, where some data may be flawed either implicitly due to the extraction process or explicitly.