Fundamental Limits for Learning Hidden Markov Model Parameters


Abstract:

We study the frontier between learnable and unlearnable hidden Markov models (HMMs). HMMs are flexible tools for clustering dependent data coming from unknown populations. The model parameters are known to be fully identifiable (up to label-switching), without any modelling assumption on the distributions of the populations, as soon as the clusters are distinct and the hidden chain is ergodic with a full-rank transition matrix. In the limit as any one of these conditions fails, it becomes impossible in general to identify the parameters. For a chain with two hidden states we prove nonasymptotic minimax upper and lower bounds, matching up to constants, which exhibit thresholds at which the parameters become learnable. We also provide an upper bound on the relative entropy rate for parameters in a neighbourhood of the unlearnable region, which may be of interest in its own right.
Published in: IEEE Transactions on Information Theory (Volume: 69, Issue: 3, March 2023)
Page(s): 1777 - 1794
Date of Publication: 13 October 2022

