Abstract:
A good approach to natural language generation is not to let the model generate whatever you want, but to teach the model not to generate what you do not want. In paraphrase generation, the paraphrase should differ from the source sentence in at least some words. In this paper, we introduce the Freedom Sentence Paraphrase Method (FSPM), which teaches the paraphrase model to generate a sentence that avoids a prompted word, i.e., a word that should not appear in the paraphrase. The model is trained by prompting it with words that occur in the source sentence but not in the reference paraphrase. At inference time, we obtain diverse paraphrases of a new sentence by enumerating different words of the source sentence as the sensitive word, and the best paraphrase can then be selected from these candidates. Moreover, the method requires no additional conditions or complex model design: it is simple, yet it significantly improves both quality and diversity over base models on the paraphrase generation task.
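The following is a minimal sketch of the inference-time idea described above: enumerate each source word as the banned (sensitive) word, generate a candidate paraphrase that avoids it, and select one candidate. It assumes a Hugging Face seq2seq model and uses the built-in bad_words_ids constraint of generate(); the model name, task prefix, and selection heuristic are illustrative assumptions, and the paper's actual FSPM training procedure is not reproduced here.

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "t5-small"  # placeholder; FSPM's actual base model may differ
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

def paraphrase_without(source: str, banned_word: str) -> str:
    """Generate a paraphrase while forbidding `banned_word` from appearing."""
    # bad_words_ids expects a list of token-id lists that must not be generated
    bad_ids = tokenizer([banned_word], add_special_tokens=False).input_ids
    inputs = tokenizer("paraphrase: " + source, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        bad_words_ids=bad_ids,
        num_beams=4,
        max_new_tokens=64,
    )
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

source = "The quick brown fox jumps over the lazy dog."
# Enumerate each source word as the sensitive word to get diverse candidates.
candidates = [paraphrase_without(source, w) for w in set(source.lower().split())]
# Assumed selection heuristic: keep the candidate most lexically different from the source.
best = max(
    candidates,
    key=lambda c: len(set(c.lower().split()) - set(source.lower().split())),
)
print(best)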
Published in: 2022 8th Annual International Conference on Network and Information Systems for Computers (ICNISC)
Date of Conference: 16-19 September 2022
Date Added to IEEE Xplore: 22 February 2023