The word burstiness phenomenon, in which a word that occurs once in a document is likely to occur again, has recently attracted interest in the text analysis field. Dirichlet Compound Multinomial Latent Dirichlet Allocation (DCMLDA) introduces this burstiness mechanism into Latent Dirichlet Allocation (LDA). However, DCMLDA places no restriction on the word burstiness intensity of each topic. Consequently, as shown in this paper, the burstiness intensities of words in major topics become extremely low, which impairs the topics' ability to represent distinct semantic meanings. To obtain topics that represent the semantic meanings of documents well, we introduce constraints on the topics' word burstiness intensities. Experiments demonstrate that DCMLDA with constrained word burstiness intensities outperforms the original unconstrained model. Moreover, these additional constraints help to reveal the relationship between two key properties inherited from DCM and LDA, respectively. These two properties strongly influence the combined model's performance, and the relationship revealed in this paper provides important guidance for further study of topic models.
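To make the burstiness mechanism concrete, the following sketch compares the Dirichlet compound multinomial (DCM, i.e. Polya) likelihood with a plain multinomial that has the same mean word probabilities. The parameter values and toy count vectors are illustrative assumptions, not taken from the paper; the point is only that the DCM assigns relatively higher probability to a document that repeats one word (a bursty document) than the multinomial does, and small Dirichlet parameters correspond to strong burstiness.

```python
from math import lgamma, log

def log_dcm(counts, alpha):
    # Dirichlet compound multinomial (Polya) log-likelihood of a count
    # vector under Dirichlet parameters alpha, dropping the multinomial
    # coefficient (it is identical in both models compared below).
    n, A = sum(counts), sum(alpha)
    return (lgamma(A) - lgamma(A + n)
            + sum(lgamma(a + x) - lgamma(a) for a, x in zip(alpha, counts)))

def log_multinomial(counts, alpha):
    # Multinomial using the DCM's mean word probabilities alpha / sum(alpha),
    # again dropping the shared multinomial coefficient.
    A = sum(alpha)
    return sum(x * log(a / A) for a, x in zip(alpha, counts) if x > 0)

alpha = [0.5, 0.5, 0.5]   # small alphas: strong burstiness (toy values)
bursty = [4, 0, 0]        # one word repeated four times
spread = [2, 1, 1]        # same length, counts spread across words

# The DCM favors the bursty document relative to the multinomial,
# and disfavors the evenly spread one:
print(log_dcm(bursty, alpha) - log_multinomial(bursty, alpha))  # positive
print(log_dcm(spread, alpha) - log_multinomial(spread, alpha))  # negative
```

Shrinking the sum of `alpha` strengthens this preference for repeated words, which is why the paper's constraints on each topic's burstiness intensity matter: without them, the fitted intensities can drift to degenerate values.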