Partially Observed Differential Games, Infinite Dimensional HJI Equations, and Nonlinear H∞ Control
Abstract
This paper presents new results for partially observed nonlinear differential games, which are applied to the nonlinear output feedback H∞ robust control problem. Using the concept of an information state, we solve these problems in terms of an infinite dimensional partial differential equation, viz., the Hamilton-Jacobi-Isaacs (HJI) equation for partially observed differential games. We give definitions of smooth and viscosity solutions, and prove that the value function is a viscosity solution of the HJI equation. We prove a verification theorem, which implies that the optimal controls are separated, in the sense that they depend on the observations only through the information state. This constitutes a separation principle for partially observed differential games. We also present some new results concerning the certainty equivalence principle.