Spectral Frame Analysis and Learning through Graph Structure

dc.contributor.advisor: Okoudjou, Kasso A
dc.contributor.advisor: Czaja, Wojciech K
dc.contributor.author: Clark, Chae Almon
dc.contributor.department: Applied Mathematics and Scientific Computation
dc.contributor.publisher: Digital Repository at the University of Maryland
dc.contributor.publisher: University of Maryland (College Park, Md.)
dc.date.accessioned: 2016-06-22T06:08:16Z
dc.date.available: 2016-06-22T06:08:16Z
dc.date.issued: 2016
dc.description.abstract: This dissertation investigates the connection between spectral analysis and frame theory. Considering the spectral properties of a frame, we present several novel results relating to its spectral decomposition. We first show that, for scalable frames, the inner products of the scaling coefficients with the eigenvectors must equal the inverse eigenvalues, and we prove a similar result when only an approximate scaling is obtained. We then focus on the optimization problems inherent to scalable frames, first showing an equivalence between scaling a frame and optimization problems with a non-restrictive objective function. Various objective functions are considered, and an analysis of the solution type is presented: linear objectives encourage sparse scalings, while barrier objective functions force dense solutions. We further consider frames in high dimensions and derive various solution techniques. From here, we restrict ourselves to particular frame classes to add specificity to the results. Frames generated from distributions allow for probabilistic bounds on scalability: for discrete distributions (Bernoulli and Rademacher) we bound the probability of encountering an orthonormal basis, and for continuous symmetric distributions (uniform and Gaussian) we show that symmetry is retained in the transformed domain. We also prove several hyperplane-separation results. With this theory developed, we discuss graph applications of the scalability framework: we make a connection with graph conditioning, show the infeasibility of the problem in the general case, and, after a modification, show that any complete graph can be conditioned. We then present a modification of standard PCA (robust PCA) developed by Candès, and give background on Electron Energy-Loss Spectroscopy (EELS). We design a novel scheme for processing EELS data through robust PCA and least-squares regression, and test this scheme on biological samples. Finally, we take the idea of robust PCA and apply the technique of kernel PCA to perform robust manifold learning. We derive the problem, present an algorithm for its solution, and discuss the differences from RPCA that make theoretical guarantees difficult.
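The scalability notion central to the abstract (finding nonnegative weights that turn a frame into a tight frame) can be illustrated numerically. The sketch below is not the dissertation's method: the function name, the least-squares feasibility test, and the crude nonnegativity clipping are all illustrative assumptions.

```python
import numpy as np

def scalability_residual(F):
    """Sketch of a scalability check (illustrative, not the dissertation's
    algorithm). F is a d x N matrix whose columns are the frame vectors.
    A frame is scalable if nonnegative weights w_i = c_i^2 exist with
    sum_i w_i f_i f_i^T = I. We solve for w by least squares and report
    the residual of that linear system as a feasibility measure."""
    d, N = F.shape
    # Each frame vector contributes the rank-one matrix f_i f_i^T,
    # flattened into a column of A; we want A @ w = vec(I).
    A = np.column_stack([np.outer(F[:, i], F[:, i]).ravel() for i in range(N)])
    b = np.eye(d).ravel()
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    w = np.clip(w, 0.0, None)  # crude nonnegativity enforcement
    residual = np.linalg.norm(A @ w - b)
    return w, residual

# An orthonormal basis is trivially scalable, with all weights equal to 1:
w, r = scalability_residual(np.eye(3))
```

A proper treatment would pose this as a constrained feasibility or optimization problem (as the abstract describes, with linear or barrier objectives selecting sparse or dense scalings); the least-squares residual here is only a quick numerical probe.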
dc.identifier: https://doi.org/10.13016/M2R47H
dc.identifier.uri: http://hdl.handle.net/1903/18340
dc.language.iso: en
dc.subject.pqcontrolled: Applied mathematics
dc.subject.pquncontrolled: EELS
dc.subject.pquncontrolled: Finite Frames
dc.subject.pquncontrolled: Graphs
dc.subject.pquncontrolled: Machine Learning
dc.subject.pquncontrolled: Optimization
dc.subject.pquncontrolled: Spectral Analysis
dc.title: Spectral Frame Analysis and Learning through Graph Structure
dc.type: Dissertation

Files

Name: Clark_umd_0117E_17090.pdf
Size: 3.06 MB
Format: Adobe Portable Document Format