Three parallel gaps in robust feedback control theory are examined: sufficiency versus necessity, deterministic versus stochastic uncertainty modeling, and stability versus performance. Deterministic and stochastic output-feedback control problems are considered with both static and dynamic controllers. The static and dynamic robust stabilization problems involve deterministically modeled bounded but unknown measurable time-varying parameter variations, while the static and dynamic stochastic optimal control problems feature state-, control-, and measurement-dependent white noise. General sufficiency conditions for the deterministic problems are obtained using Lyapunov's direct method, while necessary conditions for the stochastic problems are derived as a consequence of minimizing a quadratic performance criterion. The sufficiency tests are then applied to the necessary conditions to determine when solutions of the stochastic optimization problems also solve the deterministic robust stability problems. As an additional application of the deterministic result, the modified Riccati equation approach of Petersen and Hollot is generalized in the static case and extended to dynamic compensation.
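To make the sufficiency side concrete, the following is a generic sketch of a quadratic Lyapunov test for robust stability under bounded time-varying parameter variations; the symbols $A$, $\Delta A(t)$, $\mathcal{U}$, and $P$ are illustrative and not the paper's own notation.

```latex
% Uncertain linear system with bounded, measurable time-varying
% parameter variations (illustrative notation, not the paper's):
\[
  \dot{x}(t) = \bigl(A + \Delta A(t)\bigr)\,x(t),
  \qquad \Delta A(t) \in \mathcal{U} \ \text{(bounded uncertainty set)} .
\]
% With the quadratic Lyapunov candidate
% V(x) = x^T P x, P = P^T > 0,
% a sufficient condition for robust asymptotic stability is
\[
  \bigl(A + \Delta A\bigr)^{\mathsf{T}} P + P\bigl(A + \Delta A\bigr) < 0
  \quad \text{for all } \Delta A \in \mathcal{U},
\]
% since then, along trajectories,
\[
  \dot{V}(x) = x^{\mathsf{T}}
  \bigl[(A + \Delta A)^{\mathsf{T}} P + P (A + \Delta A)\bigr] x < 0
  \quad \text{for } x \neq 0 .
\]
```

In the paper, conditions of this Lyapunov type are the sufficiency tests that are then matched against the necessary conditions arising from the stochastic quadratic optimization problems.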