Abstract:
The majority of text modeling techniques yield only point estimates of document embeddings and do not capture the uncertainty of those estimates. These uncertainties give a notion of how well the embeddings represent a document. We present Bayesian subspace multinomial model (Bayesian SMM), a generative log-linear model that learns to represent documents in the form of Gaussian distributions, thereby encoding the uncertainty in its covariance. Additionally, in the proposed Bayesian SMM, we address a commonly encountered problem of intractability that appears during variational inference in mixed-logit models. We also present a generative Gaussian linear classifier for topic identification that exploits the uncertainty in document embeddings. Our intrinsic evaluation using the perplexity measure shows that the proposed Bayesian SMM fits unseen test data better than the state-of-the-art neural variational document model on (Fisher) speech and (20Newsgroups) text corpora. Our topic identification experiments show that the proposed systems are robust to over-fitting on unseen test data. The topic ID results show that the proposed model outperforms state-of-the-art unsupervised topic models and achieves results comparable to state-of-the-art fully supervised discriminative models.
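The abstract describes learning a Gaussian distribution per document, with uncertainty encoded in the covariance, trained via variational inference (the index terms below mention the reparameterization trick and the evidence lower bound). The following is a minimal illustrative sketch of those two ingredients, not the authors' implementation: it assumes a diagonal Gaussian posterior parameterized by a mean vector and a log-variance vector, samples an embedding differentiably, and computes the KL regularizer against a standard normal prior.

```python
import numpy as np

def sample_embedding(mu, log_var, rng):
    """Reparameterization trick: z = mu + sigma * eps, with eps ~ N(0, I).

    Sampling this way keeps z differentiable with respect to the
    variational parameters (mu, log_var), which makes gradient-based
    optimization of the evidence lower bound (ELBO) possible.
    """
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def kl_to_standard_normal(mu, log_var):
    """KL( N(mu, diag(exp(log_var))) || N(0, I) ).

    This is the regularization term of the ELBO for a diagonal
    Gaussian posterior and a standard normal prior.
    """
    return 0.5 * np.sum(np.exp(log_var) + mu ** 2 - 1.0 - log_var)

rng = np.random.default_rng(0)

# Toy 5-dimensional "document posterior": standard normal.
mu = np.zeros(5)
log_var = np.zeros(5)  # log-variance 0  =>  unit variance

z = sample_embedding(mu, log_var, rng)
print(z.shape)                              # (5,)
print(kl_to_standard_normal(mu, log_var))   # 0.0 (posterior equals prior)
```

A larger covariance (larger `log_var`) signals a less certain embedding; a downstream classifier that consumes the full distribution, rather than just `mu`, can weight documents accordingly, which is the idea the abstract's Gaussian linear classifier builds on.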
Published in: IEEE/ACM Transactions on Audio, Speech, and Language Processing ( Volume: 28)
- IEEE Keywords
- Index Terms
- Document Embedding
- Test Data
- Bayesian Model
- Log-linear
- Form Of Distribution
- Model Discrimination
- Topic Modeling
- Linear Classifier
- Subject ID
- Unsupervised Model
- Variational Inference
- Mixed Logit Model
- Speech Corpus
- Training Data
- Posterior Probability
- Latent Variables
- Row Vector
- Weight Decay
- Mean-field
- Language Model
- Latent Dirichlet Allocation
- Evidence Lower Bound
- Term Frequency-inverse Document Frequency
- Likelihood Of The Data
- True Posterior
- Precision Matrix
- Reparameterization Trick
- Training Examples
- Words In Sentences
- Variational Autoencoder
- Author Keywords