The ability to ensure reliable adaptation is important in safety-critical applications. Traditional software verification and validation techniques cannot account for the time-evolving nature of an adaptive system, making them inapplicable to adaptive computing system assurance. In this paper, we propose stability of adaptation as a heuristic measure of reliability. We present a stability monitoring technique that detects unstable learning behavior during online operation of adaptive systems. The monitoring relies on Lyapunov-like functions to detect learning states that bifurcate away from stable behavior. Dempster-Shafer theory combines the stability estimates provided by the individual monitors into an easily interpretable stability belief function. The proposed technique is evaluated in online learning experiments on data generated by an actual adaptive flight control system. The results indicate that the stability monitoring successfully detects unstable learning conditions. Our approach is among the first that can be used for the verification, validation, and monitoring of adaptive computing applications.
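The two ingredients named above can be illustrated with a minimal sketch. This is not the paper's implementation: the function names (`lyapunov_mass`, `ds_combine`), the frame of discernment `{S, U}` (stable/unstable), and the mass assignments are illustrative assumptions. The first function turns a Lyapunov-like quantity V(e) = e² of the tracking error into basic belief masses; the second combines two monitors' masses with Dempster's rule of combination.

```python
# Hypothetical sketch; names and mass assignments are illustrative,
# not taken from the paper.

def lyapunov_mass(errors, window=5, discount=0.2):
    """Derive Dempster-Shafer masses from a Lyapunov-like function V(e) = e^2.

    Frame of discernment: {'S' (stable), 'U' (unstable)}; mass on the
    full frame frozenset({'S', 'U'}) represents residual uncertainty.
    """
    V = [e * e for e in errors[-window:]]
    # Fraction of consecutive steps where V decreased, i.e. behaved
    # like a decreasing Lyapunov function (evidence of stability).
    decreases = sum(1 for a, b in zip(V, V[1:]) if b < a)
    p = decreases / max(len(V) - 1, 1)
    return {
        frozenset({'S'}): (1 - discount) * p,        # evidence for stable
        frozenset({'U'}): (1 - discount) * (1 - p),  # evidence for unstable
        frozenset({'S', 'U'}): discount,             # uncommitted mass
    }

def ds_combine(m1, m2):
    """Dempster's rule: combine two mass functions over the same frame."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb  # contradictory evidence
    norm = 1.0 - conflict
    return {k: v / norm for k, v in combined.items()}
```

Combining the masses from several monitors via `ds_combine` yields an aggregate mass function; the mass on `frozenset({'S'})` then serves as the easily interpretable stability belief, with the mass on the full frame quantifying how much the monitors remain uncommitted.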