Repr2Seq: A Data-to-Text Generation Model for Time Series


Abstract:

Data-to-text generation takes structured data as input and produces text that sufficiently describes the data. It has recently received considerable attention from both the research community and industry. However, time series, a critical form of data, is less discussed in this domain. This paper proposes Repr2Seq, a data-to-text generation model for time series. To better capture the structure and core information of a time series, Repr2Seq obtains representation vectors using time series representation learning methods, which are then fed into a neural network-based model to generate text sequences. To demonstrate the effectiveness of Repr2Seq, a dataset consisting of stock price series and corresponding comments is constructed. Experiments show that Repr2Seq achieves significant improvements over standard approaches and produces satisfactory results. We also conduct experiments to investigate the effect of hyperparameters on the model and observe performance improvements in various settings.
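The pipeline the abstract describes can be sketched minimally: a representation step compresses the raw series into a vector, and a generation step maps that vector to text. As a rough illustration only, the sketch below uses Piecewise Aggregate Approximation (a classic time series representation baseline; the paper's actual representation learning method and neural decoder are not specified here) and a rule-based stand-in for the text generator. All function names and the trend logic are hypothetical.

```python
import numpy as np

def paa_representation(series, n_segments=4):
    """Piecewise Aggregate Approximation: split the series into
    equal-length segments and keep each segment's mean. A classic
    representation baseline, not the paper's learned method."""
    series = np.asarray(series, dtype=float)
    segments = np.array_split(series, n_segments)
    return np.array([seg.mean() for seg in segments])

def generate_comment(repr_vec):
    """Hypothetical stand-in for the neural decoder: maps the
    representation vector to a short trend description."""
    delta = repr_vec[-1] - repr_vec[0]
    if delta > 0:
        return "the price trended upward over the period"
    elif delta < 0:
        return "the price trended downward over the period"
    return "the price stayed roughly flat over the period"

# Toy stock price series -> representation vector -> comment
prices = [10, 11, 12, 13, 15, 16, 18, 20]
vec = paa_representation(prices, n_segments=4)
print(generate_comment(vec))
```

In the actual model, the representation step would be a learned encoder and the comment generator a sequence decoder trained on the stock-comment pairs; the sketch only shows the data flow.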
Date of Conference: 18-23 June 2023
Date Added to IEEE Xplore: 02 August 2023
Conference Location: Gold Coast, Australia


