Partially Observed Differential Games, Infinite Dimensional HJI Equations, and Nonlinear H∞ Control


Files

TR_94-49.pdf (1.12 MB)

Date

1994

Abstract

This paper presents new results for partially observed nonlinear differential games, which are applied to the nonlinear output feedback H∞ robust control problem. Using the concept of the information state, we solve these problems in terms of an infinite dimensional partial differential equation, viz., the Hamilton-Jacobi-Isaacs equation for partially observed differential games. We give definitions of smooth and viscosity solutions, and prove that the value function is a viscosity solution of the HJI equation. We prove a verification theorem, which implies that the optimal controls are separated in the sense that they depend on the observations only through the information state. This constitutes a separation principle for partially observed differential games. We also present some new results concerning the certainty equivalence principle.
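The structure described in the abstract can be sketched schematically as follows. This is a generic outline under assumed notation, not the paper's exact equations: here $p_t$ denotes the information state, $F$ a placeholder for its (observation-driven) evolution operator, $W$ the value function on the space of information states, and $\langle\cdot,\cdot\rangle$ a suitable pairing.

```latex
\begin{align}
  % The information state evolves in an infinite dimensional space,
  % driven by the control u_t and the observation y_t:
  \dot p_t &= F(p_t, u_t, y_t), \qquad p_0 \text{ given,} \\[4pt]
  % The value function W(p,t) formally satisfies an infinite
  % dimensional Hamilton-Jacobi-Isaacs equation:
  0 &= \frac{\partial W}{\partial t}(p,t)
       + \inf_{u}\,\sup_{y}\,
         \big\langle \nabla_p W(p,t),\, F(p,u,y) \big\rangle, \\[4pt]
  % Separation principle: the optimal control depends on the
  % observations only through the information state p_t:
  u_t^{*} &= \mathbf{u}^{*}(p_t, t).
\end{align}
```

The last line is the separation property the verification theorem establishes: once $p_t$ is computed from the observations, the controller is a static feedback on the information state.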
