Topic model with constrained word burstiness intensities

4 Author(s)
Shaoze Lei; State Key Laboratory of Intelligent Technology and Systems, Tsinghua National Laboratory for Information Science and Technology (TNList), Department of Automation Engineering, Tsinghua University, Beijing 100084, China; JianWen Zhang; Shifeng Weng; Changshui Zhang

The word burstiness phenomenon, whereby a word that occurs once in a document is likely to occur repeatedly, has recently attracted interest in the text analysis field. Dirichlet Compound Multinomial Latent Dirichlet Allocation (DCMLDA) introduces this burstiness mechanism into Latent Dirichlet Allocation (LDA). However, DCMLDA places no restriction on the word burstiness intensity of each topic. Consequently, as shown in this paper, the burstiness intensities of words in major topics become extremely low, impairing the topics' ability to represent different semantic meanings. In order to obtain topics that represent the semantic meanings of documents well, we introduce constraints on the topics' word burstiness intensities. Experiments demonstrate that DCMLDA with constrained word burstiness intensities outperforms the original model without constraints. Moreover, these additional constraints help to reveal the relationship between two key properties inherited from DCM and LDA respectively. These two properties strongly influence the combined model's performance, and the relationship revealed in this paper provides important guidance for further study of topic models.
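To make the notion of word burstiness intensity concrete, the following is a minimal sketch under the standard DCM parameterization; the precision s_k below is one plausible reading of a topic's burstiness intensity, and the exact constraint form used by the authors may differ. In DCMLDA, each topic k is a Dirichlet parameter vector beta_k over the vocabulary, a document-specific word distribution is drawn from Dir(beta_k), and the marginal likelihood of a document's word-count vector n under topic k is the DCM (Polya) distribution:

\[
p(\mathbf{n} \mid \boldsymbol{\beta}_k)
  = \frac{N!}{\prod_{w} n_w!}\,
    \frac{\Gamma(s_k)}{\Gamma(s_k + N)}
    \prod_{w} \frac{\Gamma(\beta_{kw} + n_w)}{\Gamma(\beta_{kw})},
\qquad
s_k = \sum_{w} \beta_{kw}, \quad N = \sum_{w} n_w .
\]

A small precision s_k concentrates the drawn word distributions near the corners of the simplex, so a word seen once is very likely to recur (strong burstiness); as s_k grows, the topic behaves more like an ordinary multinomial. A constraint such as a lower bound on s_k (or fixing s_k to a common value across topics) would therefore keep the burstiness intensities of major topics from collapsing, which is one way the "constrained word burstiness intensities" of the title could be formalized.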

Published in:

The 2011 International Joint Conference on Neural Networks (IJCNN)

Date of Conference:

July 31 - August 5, 2011