This article argues that future generations of computer-based systems will need cognitive user interfaces to achieve sufficiently robust and intelligent human interaction. These cognitive user interfaces will be characterized by the ability to support inference and reasoning, planning under uncertainty, short-term adaptation, and long-term learning from experience. An appropriate engineering framework for such interfaces is provided by partially observable Markov decision processes (POMDPs) that integrate Bayesian belief tracking and reward-based reinforcement learning. The benefits of this approach are demonstrated by the example of a simple gesture-driven interface to an iPhone application. Furthermore, evidence is provided that humans appear to use similar mechanisms for planning under uncertainty.
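The Bayesian belief tracking mentioned above can be illustrated with a minimal sketch. This is not the article's implementation: the two-state intent space (e.g. the user intends "scroll" vs. "select"), the transition model `T`, and the observation model `O` below are invented toy values, used only to show the standard discrete POMDP belief-update rule b'(s') ∝ O(o|s',a) Σ_s T(s'|s,a) b(s).

```python
def belief_update(belief, action, observation, T, O):
    """One step of discrete Bayesian belief tracking.

    belief: list of P(state), T[s][a][s2]: transition probabilities,
    O[s2][a][o]: observation probabilities. Returns the normalized
    posterior belief after taking `action` and seeing `observation`.
    """
    states = range(len(belief))
    new_belief = []
    for s2 in states:
        # Predict: probability mass flowing into s2 under the action.
        predicted = sum(T[s][action][s2] * belief[s] for s in states)
        # Correct: weight by how likely the observation is in s2.
        new_belief.append(O[s2][action][observation] * predicted)
    norm = sum(new_belief)
    return [p / norm for p in new_belief]

# Hypothetical toy model: 2 user-intent states, 1 action, 2 gestures.
T = [[[1.0, 0.0]],   # intent tends to persist (identity dynamics)
     [[0.0, 1.0]]]
O = [[[0.8, 0.2]],   # state 0 mostly emits gesture 0
     [[0.3, 0.7]]]   # state 1 mostly emits gesture 1
belief = belief_update([0.5, 0.5], action=0, observation=0, T=T, O=O)
```

Starting from a uniform prior, observing gesture 0 shifts the belief toward state 0; a reward-driven policy would then act on this belief rather than on a single hard state estimate.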