Risk-Sensitive Control and Dynamic Games for Partially Observed Discrete-Time Nonlinear Systems

Files
TR_92-124.pdf (1003.12 KB)
Date
1992
Authors
James, Matthew R.
Baras, John S.
Elliott, Robert J.
Abstract
In this paper we solve a finite-horizon partially observed risk-sensitive stochastic optimal control problem for discrete-time nonlinear systems and obtain small noise and small risk limits. The small noise limit is interpreted as a deterministic partially observed dynamic game, and new insights into the optimal solution of such game problems are obtained; the small risk limit is a standard partially observed risk-neutral stochastic optimal control problem. Both the risk-sensitive stochastic control problem and the deterministic dynamic game problem are solved using information states, dynamic programming, and associated separated policies. A certainty equivalence principle is also discussed. Our results have implications for the nonlinear robust stabilization problem.
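As a rough illustration of the setting (the notation below is a sketch and is not taken verbatim from the report; the symbols $\mu$, $\varepsilon$, $L$, $\Phi$, $\sigma_k$, and $\Sigma$ are illustrative), the risk-sensitive criterion in problems of this type typically has the exponential-of-sum form
\[
J^{\mu,\varepsilon}(u) \;=\; \mathbb{E}\!\left[\exp\!\left(\frac{\mu}{\varepsilon}\Big(\sum_{k=0}^{M-1} L(x_k,u_k) + \Phi(x_M)\Big)\right)\right],
\]
where $\mu > 0$ is the risk-sensitivity parameter and $\varepsilon > 0$ scales the noise. Dynamic programming is carried out not on the unobserved state $x_k$ but on an information state $\sigma_k$, an unnormalized conditional measure of the state given the observations $y_1,\dots,y_k$, which evolves recursively under the control as
\[
\sigma_{k+1} \;=\; \Sigma(\sigma_k, u_k, y_{k+1}),
\]
so that the optimal control is a separated policy, i.e. a function of $\sigma_k$ alone. Heuristically, letting $\varepsilon \to 0$ replaces the expectation of the exponential by a supremum of the accumulated cost over disturbances, which gives the deterministic partially observed dynamic game, while letting $\mu \to 0$ linearizes the exponential and recovers the standard risk-neutral expected-cost problem.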