Institute for Systems Research Technical Reports
Permanent URI for this collection: http://hdl.handle.net/1903/4376
This archive contains a collection of reports generated by the faculty and students of the Institute for Systems Research (ISR), a permanent, interdisciplinary research unit in the A. James Clark School of Engineering at the University of Maryland. ISR-based projects are conducted through partnerships with industry and government, bringing together faculty and students from multiple academic departments and colleges across the university.
Search Results (4 items)
Item
Output Feedback Risk-Sensitive Control and Differential Games for Continuous-Time Nonlinear Systems (1993) James, Matthew R.; Baras, John S.; Elliott, Robert J.; ISR
In this paper we carry out a formal analysis of an output feedback risk-sensitive stochastic control problem. Using large deviation limits, this problem is related to a deterministic output feedback differential game. Both problems are solved using appropriate information states. The use of an information state for the game problem is new, and is the principal contribution of our work. Our results have implications for the nonlinear robust stabilization problem.

Item
Risk-Sensitive Control and Dynamic Games for Partially Observed Discrete-Time Nonlinear Systems (1992) James, Matthew R.; Baras, John S.; Elliott, Robert J.; ISR
In this paper we solve a finite-horizon partially observed risk-sensitive stochastic optimal control problem for discrete-time nonlinear systems, and obtain small noise and small risk limits. The small noise limit is interpreted as a deterministic partially observed dynamic game, and new insights into the optimal solution of such game problems are obtained. Both the risk-sensitive stochastic control problem and the deterministic dynamic game problem are solved using information states, dynamic programming, and associated separated policies. A certainty equivalence principle is also discussed. The small risk limit is a standard partially observed risk-neutral stochastic optimal control problem. Our results have implications for the nonlinear robust stabilization problem.

Item
The Partially Observed Stochastic Minimum Principle (1988) Baras, John S.; Elliott, Robert J.; Kohlmann, Michael; ISR
Various proofs have been given of the minimum principle satisfied by an optimal control in a partially observed stochastic control problem. See, for example, the papers by Bensoussan [1], Elliott [5], Haussmann [7], and the recent paper [9] by Haussmann in which the adjoint process is identified. The simple case of a partially observed Markov chain is discussed in the University of Maryland lecture notes [6] of the second author. We show in this article how a minimum principle for a partially observed diffusion can be obtained by differentiating the statement that a control u* is optimal. The results of Bismut [2], [3] and Kunita [10] on stochastic flows enable us to compute, in an easy and explicit way, the change in the cost due to a 'strong variation' of an optimal control. The only technical difficulty is the justification of the differentiation. As we wished to exhibit the simplification obtained by using the ideas of stochastic flows, the result is not proved under the weakest possible hypotheses. Finally, in Section 6, we show how Bensoussan's minimum principle follows from our result if the drift coefficient is differentiable in the control variable.

Item
The Conditional Adjoint Process (1988) Baras, John S.; Elliott, Robert J.; Kohlmann, Michael; ISR
The adjoint process and minimum principle for a partially observed diffusion can be obtained by differentiating the statement that a control u* is optimal. Using stochastic flows, the variation in the cost resulting from a change in an optimal control can be computed explicitly. The technical difficulty is to justify the differentiation.