
Hierarchical Speaker-Aware Sequence-to-Sequence Model for Dialogue Summarization


Abstract:

Traditional document summarization models cannot handle dialogue summarization tasks well. In conversations with multiple speakers and complex referential relationships among personal pronouns, the summaries predicted by these models are often riddled with pronoun confusion. In this paper, we propose a hierarchical transformer-based model for dialogue summarization. It encodes dialogues from words to utterances and clearly distinguishes the relationships between speakers and their corresponding personal pronouns. Through this coarse-to-fine procedure, our model generates summaries more accurately and relieves the confusion of personal pronouns. Experiments on the dialogue summarization dataset SAMSum show that the proposed model achieves results comparable to other strong baselines, and empirical analysis shows that our method relieves the confusion of personal pronouns in predicted summaries.
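The paper itself does not include code, so the following is only a minimal sketch of the word-to-utterance hierarchical encoding with speaker information described above, written in PyTorch. The module names, dimensions, mean-pooling step, and the additive speaker-embedding injection are illustrative assumptions, not the authors' exact architecture.

```python
import torch
import torch.nn as nn

class HierarchicalSpeakerEncoder(nn.Module):
    """Sketch: encode each utterance at the word level, then encode the
    sequence of utterance vectors (augmented with speaker embeddings)
    at the dialogue level."""

    def __init__(self, vocab_size, num_speakers, d_model=256, nhead=4, num_layers=2):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, d_model)
        self.speaker_emb = nn.Embedding(num_speakers, d_model)
        word_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        utt_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.word_encoder = nn.TransformerEncoder(word_layer, num_layers)
        self.utt_encoder = nn.TransformerEncoder(utt_layer, num_layers)

    def forward(self, tokens, speakers):
        # tokens: (batch, num_utts, utt_len) word ids
        # speakers: (batch, num_utts) speaker id of each utterance
        b, u, t = tokens.shape
        x = self.word_emb(tokens.view(b * u, t))          # word-level embeddings
        x = self.word_encoder(x)                          # encode words within each utterance
        utt_repr = x.mean(dim=1).view(b, u, -1)           # pool words -> utterance vectors (assumed mean pooling)
        utt_repr = utt_repr + self.speaker_emb(speakers)  # inject speaker identity
        return self.utt_encoder(utt_repr)                 # utterance-level (dialogue) encoding

# Example: 2 dialogues, 3 utterances each, 5 tokens per utterance
enc = HierarchicalSpeakerEncoder(vocab_size=1000, num_speakers=4)
out = enc(torch.randint(0, 1000, (2, 3, 5)), torch.randint(0, 4, (2, 3)))
print(out.shape)  # torch.Size([2, 3, 256])
```

In a full summarization model, a decoder would attend over these utterance-level representations to generate the summary; that part is omitted here.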
Date of Conference: 06-11 June 2021
Date Added to IEEE Xplore: 13 May 2021

Conference Location: Toronto, ON, Canada

1. INTRODUCTION

The natural language processing module that sits between the speech processing modules plays an important role in an intelligent customer service system. When processing dialogues between users and agents, it is time-consuming and labor-intensive for people to review thousands of utterances and summarize all of these dialogues by hand. Therefore, a well-performing system that summarizes dialogues accurately and fluently has attracted broad interest.
