Show simple item record

dc.contributor.advisor  Fu, Michael C.  en_US
dc.contributor.author  Chau, Marie  en_US
dc.date.accessioned  2015-06-27T05:33:13Z
dc.date.available  2015-06-27T05:33:13Z
dc.date.issued  2015  en_US
dc.identifier  https://doi.org/10.13016/M2P35S
dc.identifier.uri  http://hdl.handle.net/1903/16707
dc.description.abstract  In this dissertation, we propose two new types of stochastic approximation (SA) methods and study the sensitivity of SA and of a stochastic gradient method to various input parameters. First, we summarize the most common stochastic gradient estimation techniques, both direct and indirect, as well as the two classical SA algorithms, Robbins-Monro (RM) and Kiefer-Wolfowitz (KW), followed by some well-known modifications to the step size, output, gradient, and projection operator. Second, we introduce two new stochastic gradient methods in SA for univariate and multivariate stochastic optimization problems. Under a setting where both direct and indirect gradients are available, our new SA algorithms estimate the gradient using a hybrid estimator, which is a convex combination of a symmetric finite difference-type gradient estimate and an average of two associated direct gradient estimates. We derive variance-minimizing weights that lead to desirable theoretical properties and prove convergence of the SA algorithms. Next, we study the finite-time performance of the KW algorithm and its sensitivity to the step size parameter, along with two of its adaptive variants, namely Kesten's rule and scale-and-shifted KW (SSKW). We conduct a sensitivity analysis of KW and explore the tightness of a mean-squared error (MSE) bound for quadratic functions, a relevant issue for determining how long to run an SA algorithm. Then, we propose two new adaptive step size sequences inspired by both Kesten's rule and SSKW, which address some of their weaknesses. Instead of using one step size sequence, our adaptive step size is based on two deterministic sequences, and the step size used in the current iteration depends on the perceived proximity of the current iterate to the optimum. In addition, we introduce a method to adaptively adjust the two deterministic sequences.
Lastly, we investigate the performance of a modified pathwise gradient estimation method that is applied to financial options with discontinuous payoffs, and in particular, used to estimate the Greeks, which measure the rate of change of (financial) derivative prices with respect to underlying market parameters and are central to financial risk management. The newly proposed kernel estimator relies on a smoothing bandwidth parameter. We explore the accuracy of the Greeks with varying bandwidths and investigate the sensitivity of a proposed iterative scheme that generates an estimate of the optimal bandwidth.  en_US
dc.language.iso  en  en_US
dc.title  Stochastic Simulation: New Stochastic Approximation Methods and Sensitivity Analyses  en_US
dc.type  Dissertation  en_US
dc.contributor.publisher  Digital Repository at the University of Maryland  en_US
dc.contributor.publisher  University of Maryland (College Park, Md.)  en_US
dc.contributor.department  Applied Mathematics and Scientific Computation  en_US
dc.subject.pqcontrolled  Operations research  en_US
dc.subject.pqcontrolled  Applied mathematics  en_US
dc.subject.pquncontrolled  Monte Carlo Simulation  en_US
dc.subject.pquncontrolled  Sensitivity Analysis  en_US
dc.subject.pquncontrolled  Simulation Optimization  en_US
dc.subject.pquncontrolled  Stochastic Approximation  en_US
dc.subject.pquncontrolled  Stochastic Gradient Estimation  en_US
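The abstract's hybrid gradient idea — a convex combination of a symmetric finite-difference estimate and the average of two direct gradient estimates at the same perturbed points — can be sketched in one dimension as follows. The test function, noise model, step-size sequence a_n = 1/n, perturbation width c_n = c·n^{-1/4}, and the fixed weight `w` are all illustrative assumptions; the dissertation derives variance-minimizing weights rather than using a constant.

```python
import random

def noisy_f(x):
    # Hypothetical simulation output: f(x) = (x - 2)^2 plus Gaussian noise.
    return (x - 2.0) ** 2 + random.gauss(0.0, 0.1)

def noisy_grad(x):
    # Hypothetical direct gradient estimate: f'(x) = 2(x - 2) plus noise.
    return 2.0 * (x - 2.0) + random.gauss(0.0, 0.1)

def hybrid_sa(x0, iters=2000, c=0.5, w=0.5):
    """SA iteration whose gradient is a convex combination of a symmetric
    finite-difference estimate and the average of two direct gradient
    estimates taken at the same perturbed points x +/- c_n."""
    x = x0
    for n in range(1, iters + 1):
        a_n = 1.0 / n                 # Robbins-Monro-style step size
        c_n = c / n ** 0.25           # KW-style perturbation width
        fd = (noisy_f(x + c_n) - noisy_f(x - c_n)) / (2.0 * c_n)
        direct = 0.5 * (noisy_grad(x + c_n) + noisy_grad(x - c_n))
        g = w * fd + (1.0 - w) * direct   # hybrid estimator, w in [0, 1]
        x -= a_n * g                  # descent step toward the minimizer
    return x
```

With these illustrative choices the iterates settle near the minimizer x* = 2; e.g. `random.seed(0); hybrid_sa(0.0)` returns a value close to 2.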
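Kesten's rule, one of the adaptive KW variants the abstract compares against, shrinks the step size only when successive gradient estimates change sign — the heuristic signal that the iterate is oscillating around the optimum. A minimal sketch, assuming a hypothetical noisy gradient of f(x) = (x - 3)^2 (the constants a, the noise level, and the iteration count are illustrative):

```python
import random

def kesten_sa(grad_est, x0, iters=2000, a=1.0):
    """SA with Kesten's rule: the step-size index k advances only when two
    successive gradient estimates differ in sign, so the step size a/k stays
    large while the iterate is still marching toward the optimum."""
    x = x0
    k = 1
    g_prev = grad_est(x)
    x -= a * g_prev
    for _ in range(iters - 1):
        g = grad_est(x)
        if g * g_prev < 0.0:   # sign flip -> presumed near x*, shrink step
            k += 1
        x -= (a / k) * g
        g_prev = g
    return x

def noisy_grad(x):
    # Hypothetical noisy gradient of f(x) = (x - 3)^2.
    return 2.0 * (x - 3.0) + random.gauss(0.0, 0.2)
```

Far from the optimum the gradient keeps its sign, so k stays put and progress is fast; near x* = 3 the noise flips the sign often, k grows roughly linearly, and the step size decays like a classical 1/n sequence.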
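The final contribution concerns pathwise (IPA) Greek estimation for discontinuous payoffs, where the indicator's almost-everywhere-zero derivative is replaced by a kernel with a smoothing bandwidth. The sketch below estimates the delta of a digital call under geometric Brownian motion with a Gaussian kernel; the model, all parameter values, and the fixed bandwidth `h` are illustrative assumptions — the dissertation studies how accuracy varies with the bandwidth and how to choose it iteratively, which this sketch does not do.

```python
import math
import random

def digital_delta_kernel(s0=100.0, k=100.0, r=0.05, sigma=0.2, t=1.0,
                         h=2.0, n_paths=20000):
    """Monte Carlo delta of a digital call with payoff 1{S_T > K} via a
    kernel-smoothed pathwise estimator: the derivative of the indicator is
    replaced by a Gaussian kernel of bandwidth h centered at the strike."""
    disc = math.exp(-r * t)
    total = 0.0
    for _ in range(n_paths):
        z = random.gauss(0.0, 1.0)
        # Terminal stock price under GBM (risk-neutral dynamics).
        s_t = s0 * math.exp((r - 0.5 * sigma ** 2) * t
                            + sigma * math.sqrt(t) * z)
        ds_ds0 = s_t / s0   # pathwise derivative dS_T/dS_0 under GBM
        # Gaussian kernel standing in for the derivative of 1{S_T > K}.
        kern = (math.exp(-0.5 * ((s_t - k) / h) ** 2)
                / (h * math.sqrt(2.0 * math.pi)))
        total += disc * kern * ds_ds0
    return total / n_paths
```

Smaller bandwidths reduce smoothing bias but inflate the estimator's variance, which is exactly the trade-off behind the bandwidth-selection scheme the abstract describes.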

