ESSAYS IN STATISTICAL ANALYSIS: ISOTONIC REGRESSION AND FILTERING

dc.contributor.advisor: Ryzhov, Ilya O
dc.contributor.advisor: Smith, Paul J
dc.contributor.author: Xue, Jinhang
dc.contributor.department: Mathematical Statistics
dc.contributor.publisher: Digital Repository at the University of Maryland
dc.contributor.publisher: University of Maryland (College Park, Md.)
dc.date.accessioned: 2019-01-31T06:32:08Z
dc.date.available: 2019-01-31T06:32:08Z
dc.date.issued: 2018
dc.description.abstract: In many real-world applications of optimal information collection and stochastic approximation, statistical estimators are constructed to learn the true parameter value of a utility function or an underlying signal. Many of these estimators exhibit excellent empirical performance, but full analyses of their consistency have not previously been available, leaving decision-makers in somewhat of a predicament regarding implementation. The goal of this dissertation is to fill this gap by supplying the missing consistency proofs.

The first part of this thesis considers the consistency of estimating a monotonic cost function that appears in an optimal learning algorithm combining isotonic regression with a Bayesian policy known as Knowledge Gradient with Discrete Priors (KGDP). Isotonic regression deals with regression problems under order constraints. Previous literature proposed estimating the cost function by a weighted sum of a pool of candidate curves, each generated by the isotonic regression estimator from all previously collected observations, with the weights calculated by KGDP. Our primary objective is to establish the consistency of the suggested estimator. Some minor results, concerning the knowledge gradient algorithm and the isotonic regression estimator under insufficient observations, are also discussed.

The second part of this thesis focuses on the convergence of the bias-adjusted Kalman filter (BAKF). The BAKF algorithm is designed to optimize the statistical estimation of a non-stationary signal that can only be observed with stochastic noise, and it has numerous applications in dynamic programming and signal processing. However, a consistency analysis of the process that approximates the underlying signal has heretofore not been available. We resolve this open issue by showing that the BAKF stepsize satisfies the well-known conditions for almost sure convergence of a stochastic approximation sequence, requiring only one additional assumption on the convergence rate of the signal beyond those used in the derivation of the original problem.
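For context on the first part of the abstract, isotonic regression (least-squares fitting under a monotonicity constraint) is classically computed with the pool-adjacent-violators algorithm. The sketch below is a minimal unweighted-by-default illustration of that standard algorithm, not the dissertation's own implementation; the function name is ours.

```python
def isotonic_regression(y, w=None):
    """Fit a nondecreasing sequence to y by least squares (PAVA).

    Adjacent values that violate the order constraint are pooled
    into blocks and replaced by their weighted mean.
    """
    if w is None:
        w = [1.0] * len(y)
    # Each block is [pooled value, total weight, number of points].
    blocks = []
    for yi, wi in zip(y, w):
        blocks.append([yi, wi, 1])
        # Merge backwards while the monotonicity constraint is violated.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            v1, w1, c1 = blocks.pop()
            v0, w0, c0 = blocks.pop()
            wt = w0 + w1
            blocks.append([(w0 * v0 + w1 * v1) / wt, wt, c0 + c1])
    # Expand each block back to one fitted value per observation.
    fit = []
    for v, _, c in blocks:
        fit.extend([v] * c)
    return fit
```

For example, `isotonic_regression([4, 3, 2, 1])` pools all four observations into a single block and returns the constant fit `[2.5, 2.5, 2.5, 2.5]`, while an already-monotone input is returned unchanged.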
dc.identifier: https://doi.org/10.13016/0r56-0ndt
dc.identifier.uri: http://hdl.handle.net/1903/21582
dc.language.iso: en
dc.subject.pqcontrolled: Statistics
dc.subject.pquncontrolled: Consistency
dc.subject.pquncontrolled: Filtering
dc.subject.pquncontrolled: Isotonic Regression
dc.title: ESSAYS IN STATISTICAL ANALYSIS: ISOTONIC REGRESSION AND FILTERING
dc.type: Dissertation

Files

Original bundle
Name: Xue_umd_0117E_19483.pdf
Size: 1.24 MB
Format: Adobe Portable Document Format