The Partially Observed Stochastic Minimum Principle.

Files

TR_88-2.pdf (463.46 KB)

Date

1988

Abstract

Various proofs have been given of the minimum principle satisfied by an optimal control in a partially observed stochastic control problem. See, for example, the papers by Bensoussan [1], Elliott [5], and Haussmann [7], and the recent paper [9] by Haussmann in which the adjoint process is identified. The simple case of a partially observed Markov chain is discussed in the University of Maryland lecture notes [6] of the second author. We show in this article how a minimum principle for a partially observed diffusion can be obtained by differentiating the statement that a control U* is optimal. The results of Bismut [2], [3] and Kunita [10] on stochastic flows enable us to compute, in an easy and explicit way, the change in the cost due to a 'strong variation' of an optimal control. The only technical difficulty is the justification of the differentiation. As we wished to exhibit the simplification obtained by using the ideas of stochastic flows, the result is not proved under the weakest possible hypotheses. Finally, in Section 6, we show how Bensoussan's minimum principle follows from our result if the drift coefficient is differentiable in the control variable.
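The differentiation argument described in the abstract can be summarized in a few displayed equations. The sketch below is ours, not the paper's: the dynamics f and sigma, the costs c and g, the observation filtration Y_t, the adjoint process p_t, and the Hamiltonian H are generic notation, and the paper's precise hypotheses and the identification of the adjoint process may differ.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Sketch of the strong-variation argument, in generic notation.
Consider the controlled diffusion and cost
\begin{equation*}
  dx_t = f(x_t,u_t)\,dt + \sigma(x_t)\,dw_t, \qquad
  J(u) = \mathbb{E}\!\left[\int_0^T c(x_t,u_t)\,dt + g(x_T)\right],
\end{equation*}
with controls $u$ adapted to the observation filtration $(\mathcal{Y}_t)$.
A strong variation of the optimal control $U^{*}$ replaces it by an
arbitrary admissible value $v$ on a short interval:
\begin{equation*}
  u^{\varepsilon}_s =
    \begin{cases}
      v,       & s \in [t, t+\varepsilon),\\[2pt]
      U^{*}_s, & \text{otherwise}.
    \end{cases}
\end{equation*}
Since $U^{*}$ is optimal, $J(u^{\varepsilon}) \ge J(U^{*})$ for every
$\varepsilon > 0$; dividing by $\varepsilon$ and letting
$\varepsilon \downarrow 0$ (the differentiation being the step justified
via stochastic flows) gives a conditional minimum principle
\begin{equation*}
  \mathbb{E}\bigl[H(x_t,p_t,U^{*}_t) \bigm| \mathcal{Y}_t\bigr]
  = \min_{v}\,
    \mathbb{E}\bigl[H(x_t,p_t,v) \bigm| \mathcal{Y}_t\bigr]
  \qquad \text{a.e. } t,\ \text{a.s.},
\end{equation*}
where $p_t$ denotes the adjoint process and
$H(x,p,u) = p \cdot f(x,u) + c(x,u)$.
\end{document}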
