The Partially Observed Stochastic Minimum Principle.

dc.contributor.author: Baras, John S.
dc.contributor.author: Elliott, Robert J.
dc.contributor.author: Kohlmann, Michael
dc.contributor.department: ISR
dc.date.accessioned: 2007-05-23T09:40:38Z
dc.date.available: 2007-05-23T09:40:38Z
dc.date.issued: 1988
dc.description.abstract: Various proofs have been given of the minimum principle satisfied by an optimal control in a partially observed stochastic control problem. See, for example, the papers by Bensoussan [1], Elliott [5], Haussmann [7], and the recent paper [9] by Haussmann in which the adjoint process is identified. The simple case of a partially observed Markov chain is discussed in the University of Maryland lecture notes [6] of the second author. We show in this article how a minimum principle for a partially observed diffusion can be obtained by differentiating the statement that a control u* is optimal. The results of Bismut [2], [3] and Kunita [10] on stochastic flows enable us to compute in an easy and explicit way the change in the cost due to a 'strong variation' of an optimal control. The only technical difficulty is the justification of the differentiation. As we wished to exhibit the simplification obtained by using the ideas of stochastic flows, the result is not proved under the weakest possible hypotheses. Finally, in Section 6, we show how Bensoussan's minimum principle follows from our result if the drift coefficient is differentiable in the control variable.
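For orientation, a minimum principle of the kind described in the abstract is usually stated along the following lines. This is a hedged sketch under standard assumptions, not the report's exact statement: the drift f, running cost \ell, adjoint process p_t, and observation \sigma-algebra \mathcal{Y}_t are illustrative notation. For a partially observed diffusion dx_t = f(x_t, u_t)\,dt + \sigma(x_t)\,dw_t, an optimal control u^* minimizes the conditional expectation of the Hamiltonian given the observations:

\[
\mathbb{E}\!\left[\mathcal{H}(x_t, p_t, u_t^*) \mid \mathcal{Y}_t\right]
 = \min_{u \in U} \mathbb{E}\!\left[\mathcal{H}(x_t, p_t, u) \mid \mathcal{Y}_t\right],
\qquad
\mathcal{H}(x, p, u) = p \cdot f(x, u) + \ell(x, u),
\]

for almost every t, almost surely.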
dc.format.extent: 474585 bytes
dc.format.mimetype: application/pdf
dc.identifier.uri: http://hdl.handle.net/1903/4735
dc.language.iso: en_US
dc.relation.ispartofseries: ISR; TR 1988-2
dc.title: The Partially Observed Stochastic Minimum Principle.
dc.type: Technical Report

Files

Original bundle (1 of 1):
TR_88-2.pdf (463.46 KB, Adobe Portable Document Format)