
Full-Text or Abstract? Examining Topic Coherence Scores Using Latent Dirichlet Allocation


Abstract:

This paper assesses topic coherence and human topic ranking of uncovered latent topics from scientific publications when utilizing the topic model latent Dirichlet allocation (LDA) on abstract and full-text data. The coherence of a topic, used as a proxy for topic quality, is based on the distributional hypothesis, which states that words with similar meaning tend to co-occur within a similar context. Although LDA has gained much attention from machine-learning researchers, most notably through its adaptations and extensions, little is known about the effects of different types of textual data on the generated topics. Our research is the first to explore these practical effects and shows that document frequency, document word length, and vocabulary size have mixed practical effects on topic coherence and human topic ranking of LDA topics. We furthermore show that large document collections are less affected by incorrect or noise terms appearing in the topic-word distributions, causing topics to be more coherent and ranked higher. Differences between abstract and full-text data are more apparent within small document collections, where full-text data yields up to 90% high-quality topics compared with 50% for abstract data.
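The co-occurrence-based notion of coherence described above can be illustrated with a minimal, stdlib-only sketch of a UMass-style coherence score (the paper does not publish its code, so the function name and corpus format here are illustrative assumptions). The score sums, over ordered pairs of a topic's top words, the log ratio of the pair's document co-occurrence count (plus one, for smoothing) to the document frequency of the more probable word:

```python
from itertools import combinations
from math import log

def umass_coherence(top_words, documents):
    """UMass-style topic coherence: higher scores mean the topic's top
    words co-occur more often in the reference corpus.

    top_words: topic words ordered from most to least probable
               (each assumed to occur in at least one document).
    documents: corpus as a list of token lists.
    """
    def doc_freq(*words):
        # Number of documents containing all of the given words.
        return sum(1 for doc in documents if all(w in doc for w in words))

    score = 0.0
    for i, j in combinations(range(len(top_words)), 2):
        w_earlier, w_later = top_words[i], top_words[j]
        # +1 smoothing avoids log(0) for pairs that never co-occur.
        score += log((doc_freq(w_earlier, w_later) + 1) / doc_freq(w_earlier))
    return score
```

On a toy corpus, a word pair that always co-occurs scores higher than a pair that rarely does, which is the intuition behind using coherence as a proxy for topic quality:

```python
docs = [["apple", "banana"], ["apple", "banana"], ["apple", "cherry"]]
umass_coherence(["apple", "banana"], docs)  # log(3/3) = 0.0
umass_coherence(["apple", "cherry"], docs)  # log(2/3) < 0
```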
Date of Conference: 19-21 October 2017
Date Added to IEEE Xplore: 18 January 2018
Conference Location: Tokyo, Japan
