We discuss algorithms for combining sequential prediction strategies, a task which can be viewed as a natural generalization of the concept of universal coding. We describe a graphical language based on hidden Markov models for defining prediction strategies, and we provide both existing and new models as examples. The models include efficient, parameterless models for switching between the input strategies over time; a model for the case where switches tend to occur in clusters; and a new model for the scenario where the prediction strategies have a known relationship and jumps typically occur between strongly related ones. This last model is relevant for coding time series data where parameter drift is expected. As theoretical contributions, we introduce an interpolation construction that is useful in the development and analysis of new algorithms, and we establish a new lemma for analyzing the individual sequence regret of parameterized models.
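As a rough illustration of the general idea (not any specific model from this work), the following sketch combines two prediction strategies with a simple fixed-share switching scheme, which corresponds to an HMM whose hidden state is the currently active strategy and whose transitions allow a switch with some probability `alpha` at each step. The expert strategies, the switch rate, and the binary-sequence setting are all assumptions chosen for the example.

```python
import math

def combine_fixed_share(experts, xs, alpha=0.05):
    """Combine expert prediction strategies via fixed-share switching.

    This is a generic sketch: the HMM hidden state is the active expert,
    and with probability alpha the state moves uniformly at each step.
    experts: callables mapping the past sequence to P(next bit = 1).
    xs: observed binary sequence (list of 0/1).
    Returns the total log-loss in bits of the combined predictor.
    """
    k = len(experts)
    w = [1.0 / k] * k          # posterior weight on each hidden state
    total_loss = 0.0
    for t, x in enumerate(xs):
        past = xs[:t]
        preds = [e(past) for e in experts]
        p1 = sum(wi * pi for wi, pi in zip(w, preds))   # mixture prediction
        p = p1 if x == 1 else 1.0 - p1
        total_loss += -math.log2(p)
        # Bayes update: weight each expert by its likelihood of the bit
        w = [wi * (pi if x == 1 else 1.0 - pi) for wi, pi in zip(w, preds)]
        s = sum(w)
        w = [wi / s for wi in w]
        # HMM transition step: stay with prob. 1 - alpha, else resample
        w = [(1 - alpha) * wi + alpha / k for wi in w]
    return total_loss

# Two toy input strategies (assumed for illustration):
# a uniform coin and a Krichevsky-Trofimov add-1/2 estimator.
uniform = lambda past: 0.5
kt = lambda past: (past.count(1) + 0.5) / (len(past) + 1.0)
```

On a sequence of 20 ones, the combined predictor quickly shifts its weight onto the KT strategy and incurs less than the 20 bits the uniform coin alone would pay, which is the kind of adaptivity the switching models above are designed to provide.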