
Combining Domain Knowledge Extraction With Graph Long Short-Term Memory for Learning Classification of Chinese Legal Documents

Open Access

The graph briefly introduces the approach to classifying Chinese legal documents by combining Graph LSTM with domain knowledge extraction. The work can be divided ...

Abstract:

Given the massive volume and complex structure of electronic Chinese legal documents, an effective classification method with deep semantic understanding is of great importance for procedure retrieval. In this paper, a method for learning Chinese legal document classification using Graph LSTM (Long Short-Term Memory) combined with domain knowledge extraction is proposed. First, a judicial domain model is constructed based on ontologies, comprising a top-level ontology and a domain-specific ontology. Second, the legal documents are divided into different knowledge blocks through the top-level and domain-specific ontologies. Third, information is extracted from the knowledge blocks according to the legal domain model and stored in XML files. Finally, Graph LSTM is applied for classification. The experiments show that, compared with the traditional classification methods of support vector machine (SVM) and LSTM, Graph LSTM achieves higher classification accuracy and better overall classification performance.
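The third step, serializing extracted information to XML, can be illustrated with a minimal sketch. The element and attribute names below (`KnowledgeBlock`, `Field`) are assumptions for illustration, not the paper's actual schema:

```python
import xml.etree.ElementTree as ET

def block_to_xml(block):
    """Serialize one extracted knowledge block to an XML string.

    `block` is a dict with a block type (e.g. the ontology class it was
    matched to) and the extracted field/value pairs. The tag and attribute
    names here are hypothetical, not the authors' schema.
    """
    root = ET.Element("KnowledgeBlock", {"type": block["type"]})
    for key, value in block["fields"].items():
        field = ET.SubElement(root, "Field", {"name": key})
        field.text = value
    return ET.tostring(root, encoding="unicode")

# Toy example of a knowledge block extracted from a judgment document.
xml_str = block_to_xml({
    "type": "CaseFacts",
    "fields": {
        "court": "Beijing No.1 Intermediate Court",
        "cause": "contract dispute",
    },
})
```

Storing each block as a separate element keeps the ontology-guided structure explicit, so the downstream classifier can consume blocks as graph nodes rather than raw running text.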
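The final classification step uses a Graph LSTM, i.e., LSTM gating where each node's update also sees an aggregate of its neighbors' hidden states. The following is a minimal NumPy sketch of one plausible cell update over knowledge-block nodes; the neighbor-mean aggregation, gate layout, and dimensions are assumptions, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GraphLSTMCell:
    """Sketch of a Graph LSTM cell: standard i/f/o/candidate gates computed
    from the node input, its own hidden state, and the mean of its
    neighbors' hidden states (a common Graph LSTM variant, assumed here)."""

    def __init__(self, input_dim, hidden_dim):
        concat = input_dim + 2 * hidden_dim  # [x, h_own, h_neighbor_mean]
        self.W = {g: rng.normal(0, 0.1, (concat, hidden_dim)) for g in "ifoc"}
        self.b = {g: np.zeros(hidden_dim) for g in "ifoc"}

    def step(self, x, h, c, adj):
        # adj: (n, n) adjacency over knowledge-block nodes.
        deg = adj.sum(axis=1, keepdims=True).clip(min=1.0)
        h_nbr = (adj @ h) / deg                      # degree-normalized neighbor mean
        z = np.concatenate([x, h, h_nbr], axis=1)
        i = sigmoid(z @ self.W["i"] + self.b["i"])   # input gate
        f = sigmoid(z @ self.W["f"] + self.b["f"])   # forget gate
        o = sigmoid(z @ self.W["o"] + self.b["o"])   # output gate
        g = np.tanh(z @ self.W["c"] + self.b["c"])   # candidate cell state
        c_new = f * c + i * g
        h_new = o * np.tanh(c_new)
        return h_new, c_new
```

After a few such propagation steps, the node hidden states could be mean-pooled into a document vector and fed to a softmax layer for the class label; that readout choice is likewise an assumption of this sketch.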
Published in: IEEE Access ( Volume: 7)
Page(s): 139616 - 139627
Date of Publication: 25 September 2019
Electronic ISSN: 2169-3536
