UMD Theses and Dissertations
Permanent URI for this collection: http://hdl.handle.net/1903/3
New submissions to the thesis/dissertation collections are added automatically as they are received from the Graduate School. Currently, the Graduate School deposits all theses and dissertations from a given semester after the official graduation date. This means that there may be up to a 4-month delay in the appearance of a given thesis/dissertation in DRUM.
More information is available at Theses and Dissertations at University of Maryland Libraries.
Search Results: 3 items
Item: U(R) PHASE RETRIEVAL, LOCAL NORMALIZING FLOWS, AND HIGHER ORDER FOURIER TRANSFORMS (2022)
Dock, Christopher Barton; Balan, Radu; Applied Mathematics and Scientific Computation; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)

The classical phase retrieval problem arises in contexts ranging from speech recognition to x-ray crystallography and quantum state tomography. The generalization to matrix frames is natural in the sense that it corresponds to quantum tomography of impure states. Chapter 1 provides computable global stability bounds for the quasi-linear analysis map $\beta$ and a path forward for understanding related problems in terms of the differential geometry of key spaces. In particular, Chapter 1 exhibits a Whitney stratification of the positive semidefinite matrices of low rank, which allows us to "stratify" the computation of the global stability bound. We show that in the impure state case no such global stability bounds can be obtained for the non-linear analysis map $\alpha$ with respect to certain natural distance metrics. Finally, our computation of the global lower Lipschitz constant for the $\beta$ analysis map provides novel conditions for a frame to be generalized phase retrievable. In Chapter 2 we develop the concept of local normalizing flows. Normalizing flows provide an elegant approach to generative modeling that allows for efficient sampling and exact density evaluation of unknown data distributions. However, current techniques have significant limitations in their expressivity when the data distribution is supported on a low-dimensional manifold or has a non-trivial topology. We introduce a novel statistical framework for learning a mixture of local normalizing flows as "chart maps" over the data manifold. Our framework augments the expressivity of recent approaches while preserving the signature property of normalizing flows: that they admit exact density evaluation.
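The exact density evaluation property mentioned here comes from the change-of-variables formula: if an invertible map f sends data x to a latent z with a known base density, then log p_X(x) = log p_Z(f(x)) + log |det Df(x)|. A minimal one-dimensional sketch (the affine map and its parameters are illustrative, not taken from the dissertation):

```python
import numpy as np

def affine_flow(x, scale=2.0, shift=0.5):
    """Invertible affine map z = scale * x + shift (a toy one-layer flow)."""
    z = scale * x + shift
    log_det_jacobian = np.log(abs(scale))  # Jacobian is constant for an affine map
    return z, log_det_jacobian

def log_density(x):
    """Exact log-density of x under the flow, with a standard normal base."""
    z, log_det = affine_flow(x)
    log_base = -0.5 * (z**2 + np.log(2 * np.pi))  # standard normal log-pdf at z
    return log_base + log_det  # change-of-variables formula

print(log_density(0.3))
```

Because the flow is invertible, the same formula run in reverse gives exact sampling: draw z from the base distribution and apply the inverse map.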
We learn a suitable atlas of charts for the data manifold via a vector quantized auto-encoder (VQ-AE) and the distributions over them using a conditional flow. We validate experimentally that our probabilistic framework enables existing approaches to better model data distributions over complex manifolds. In Chapter 3 we examine higher order Fourier transforms in both discrete and continuous contexts. We demonstrate a connection to a matrix time variant of the free Schrödinger equation, as well as a potential application to magnetic resonance imaging. In the discrete case we show that the reconstruction properties of higher order Fourier frames are intricately related to quadratic Gauss sums.

Item: HOMOTOPY CONTINUATION METHODS FOR PHASE RETRIEVAL (2021)
Bekkerman, David; Balan, Radu; Applied Mathematics and Scientific Computation; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)

In this dissertation, we discuss the problem of recovering a signal from a set of phaseless measurements. This type of problem shows up in numerous applications and is known for its numerical difficulty. It finds use in X-ray crystallography, microscopy, quantum information, and many other areas. We formulate the problem using a non-convex quadratic loss function whose global minimum recovers the phase of the measurement. Our approach to this problem is via a homotopy continuation method. These methods have found great use in solving systems of nonlinear equations in numerical algebraic geometry. The idea is to initialize the solution of a related system at a known global optimum, then continuously deform the criterion and follow the solution path until we find the minimum of the desired loss function. We analyze convergence properties and asymptotic results for these algorithms, as well as gather some numerical statistics.
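The path-following idea behind homotopy continuation can be illustrated on a toy one-dimensional root-finding problem (a generic sketch, not the dissertation's Golden Retriever algorithm): deform an easy start system into the target system and apply a Newton corrector at each value of the homotopy parameter.

```python
import numpy as np

def homotopy(t, x):
    """H(x, t) = (1 - t) * f(x) + t * g(x), with start system f(x) = x - 1
    and target system g(x) = x**3 - 2, so H(x, 0) has the known root x = 1."""
    return (1 - t) * (x - 1) + t * (x**3 - 2)

def d_homotopy_dx(t, x):
    """Partial derivative of H with respect to x, used by the Newton corrector."""
    return (1 - t) + t * 3 * x**2

def continuation(steps=100, newton_iters=5):
    x = 1.0  # known solution of the start system at t = 0
    for t in np.linspace(0.0, 1.0, steps + 1)[1:]:
        for _ in range(newton_iters):  # Newton corrector at fixed t
            x -= homotopy(t, x) / d_homotopy_dx(t, x)
    return x  # approximate root of the target system at t = 1

print(continuation())  # converges to 2 ** (1/3)
```

The warm start from the previous parameter value keeps each Newton solve in the basin of attraction, which is why the path can be followed with only a few corrector iterations per step.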
The main contribution of this thesis is deriving conditions for convergence of the algorithm and an asymptotic rate when these conditions are satisfied. We also show that the algorithm achieves good numerical accuracy. The dissertation is split into several chapters, further divided by the real and complex case. Chapter 1 gives some background on abstract phase retrieval and homotopy continuation methods. Chapter 2 covers the nature of the algorithm (named the Golden Retriever), gives a summary and description of the theoretical results, and shows some numerical results. Chapter 3 covers the details of the derivation and results in the real case, and Chapter 4 covers the same for the complex case.

Item: Nonlinear Analysis of Phase Retrieval and Deep Learning (2017)
Zou, Dongmian; Balan, Radu V; Applied Mathematics and Scientific Computation; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)

Nonlinearity causes information loss. The phase retrieval problem, or the phaseless reconstruction problem, seeks to reconstruct a signal from the magnitudes of linear measurements. With a more complicated design, convolutional neural networks use nonlinearity to extract useful features. We can model both problems in a frame-theoretic setting. In the presence of noise, it is important to study the stability of the phaseless reconstruction and of the feature extraction part of convolutional neural networks. We prove Lipschitz properties in both cases. For the phaseless reconstruction problem, we show that phase retrievability implies a bi-Lipschitz reconstruction map, which can be extended to the Euclidean space to accommodate noise while remaining stable. For the deep learning problem, we set up a general framework for convolutional neural networks and provide an approach for computing their Lipschitz constants.
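One standard way to obtain a Lipschitz bound of the kind discussed in the last abstract (a generic sketch assuming 1-Lipschitz activations such as ReLU; the network and weights below are hypothetical, not the dissertation's framework) is to multiply the spectral norms of the linear layers:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical two-layer network: random weights, for illustration only.
weights = [rng.standard_normal((8, 4)), rng.standard_normal((2, 8))]

def lipschitz_upper_bound(weight_matrices):
    """Product of the largest singular values, one per linear layer;
    a valid upper bound when the activations between layers are 1-Lipschitz."""
    bound = 1.0
    for W in weight_matrices:
        bound *= np.linalg.svd(W, compute_uv=False)[0]  # spectral norm
    return bound

def forward(x):
    """Two-layer ReLU network using the weights above."""
    h = np.maximum(weights[0] @ x, 0.0)  # ReLU is 1-Lipschitz
    return weights[1] @ h

bound = lipschitz_upper_bound(weights)
x, y = rng.standard_normal(4), rng.standard_normal(4)
ratio = np.linalg.norm(forward(x) - forward(y)) / np.linalg.norm(x - y)
print(ratio <= bound)  # prints True: the empirical ratio never exceeds the bound
```

This product-of-norms bound is simple but often loose; tighter estimates are one motivation for frameworks that compute Lipschitz constants more carefully.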