A Compact Encoding for Efficient Character-level Deep Text Classification


Abstract:

This paper puts forward a new text-to-tensor representation that relies on information compression techniques to assign shorter codes to the most frequently used characters. The representation is language-independent, requires no pretraining, and produces an encoding with no information loss. It provides an adequate description of the morphology of text, as it represents prefixes, declensions, and inflections with similar vectors and can encode even words unseen in the training dataset. Moreover, because it is compact yet sparse, it is well suited to speeding up training with tensor processing libraries. We show that this technique is especially effective when coupled with convolutional neural networks (CNNs) for character-level text classification, and we apply two CNN variants on top of it. Experimental results show that it drastically reduces the number of parameters to be optimized, yielding competitive classification accuracy in only a fraction of the time spent by one-hot encoding representations, thus enabling training on commodity hardware.
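
The abstract does not spell out the construction, so the following is a minimal sketch of one plausible reading: a Huffman-style code built from character frequencies, where each character's variable-length bit string is packed into a fixed-width binary column, giving a matrix with far fewer rows than a one-hot alphabet. The function names (build_huffman_code, encode), the fixed sequence length, and the all-zero handling of unseen characters are illustrative assumptions, not the authors' published method.

    # Hypothetical sketch of a compression-based character encoding.
    # Frequent characters get shorter bit strings; a text becomes a
    # (code_width x max_len) binary matrix usable as CNN input.
    import heapq
    from collections import Counter

    import numpy as np


    def build_huffman_code(corpus):
        """Map each character to a variable-length bit string (shorter = more frequent)."""
        freq = Counter(corpus)
        # Heap entries: (frequency, unique tie-breaker, {char: code-so-far}).
        heap = [(n, i, {ch: ""}) for i, (ch, n) in enumerate(freq.items())]
        heapq.heapify(heap)
        tie = len(heap)
        while len(heap) > 1:
            f1, _, left = heapq.heappop(heap)
            f2, _, right = heapq.heappop(heap)
            # Prepend the branch bit; bits closer to the root end up first.
            merged = {ch: "0" + c for ch, c in left.items()}
            merged.update({ch: "1" + c for ch, c in right.items()})
            heapq.heappush(heap, (f1 + f2, tie, merged))
            tie += 1
        return heap[0][2]


    def encode(text, code, max_len=128):
        """Pack a string into a (code_width, max_len) binary matrix, one column per character."""
        width = max(len(c) for c in code.values())
        mat = np.zeros((width, max_len), dtype=np.float32)
        for j, ch in enumerate(text[:max_len]):
            bits = code.get(ch, "")  # assumption: unseen characters stay all-zero
            for i, b in enumerate(bits):
                mat[i, j] = float(b)
        return mat


    if __name__ == "__main__":
        corpus = "the quick brown fox jumps over the lazy dog " * 100
        code = build_huffman_code(corpus)
        x = encode("the fox", code)
        print(x.shape)  # e.g. (7, 128): far fewer rows than a one-hot alphabet

Under this reading, the compactness claim follows directly: a 70-character alphabet needs 70 rows per position under one-hot encoding, while a Huffman-style code needs only on the order of log2(70), roughly 7, so the first convolutional layer sees an input an order of magnitude smaller.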
Date of Conference: 08-13 July 2018
Date Added to IEEE Xplore: 14 October 2018
Electronic ISSN: 2161-4407
Conference Location: Rio de Janeiro, Brazil
