
Multi-Label Neural Decoders for Block Codes


Abstract:

The problem of decoding an (n, k, d) error-correcting block code, where a k-bit message word is mapped to an n-bit codeword, can be cast as a single-label classification problem. While it has been observed that the performance of such single-label neural decoders closely approaches that of the corresponding maximum likelihood decoder (MLD), the number of output nodes increases exponentially with k, making the decoder prohibitive to implement for large k. To address this issue, we explore classification-based multi-label neural decoders, in which the number of output nodes increases linearly with k. We consider well-known linear and non-linear block codes, as well as concatenated block codes, which have applications in emerging wireless networks. Our study finds that (i) although the number of output nodes increases linearly with k in a multi-label decoder, it requires more hidden layers and more nodes per hidden layer than the corresponding single-label decoder to achieve its best performance, and (ii) although one can design a multi-label decoder whose bit error rate matches that of the MLD, it leaves more blocks in error, leading to reduced performance in terms of block error rate. We also note that the performance of the proposed decoder for concatenated codes is at least as good as that of a natural decoding algorithm in which the inner code is first decoded using the MLD and then the outer code is decoded with a polynomial-time decoding algorithm.
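
To illustrate the distinction the abstract draws, the following is a minimal sketch (not the authors' implementation) of a multi-label neural decoder for the (7, 4) Hamming code: it uses k = 4 independent bit outputs trained with a per-bit binary cross-entropy loss, rather than the 2^k = 16 softmax outputs a single-label decoder would need. The network sizes, noise level, and training settings are illustrative assumptions only.

# Sketch of a multi-label neural decoder for the (7, 4) Hamming code.
# Assumptions: MLP architecture, AWGN training noise, hyperparameters.
import itertools
import torch
import torch.nn as nn

# Generator matrix of the (7, 4) Hamming code (systematic form).
G = torch.tensor([
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
], dtype=torch.float32)
k, n = G.shape

# All 2^k message words, their codewords, and BPSK symbols (0 -> +1, 1 -> -1).
messages = torch.tensor(list(itertools.product([0, 1], repeat=k)), dtype=torch.float32)
codewords = (messages @ G) % 2
symbols = 1.0 - 2.0 * codewords

# Multi-label decoder: n soft channel values in, k bit logits out
# (output width grows linearly with k, unlike the 2^k-way single-label decoder).
decoder = nn.Sequential(
    nn.Linear(n, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, k),
)

optimizer = torch.optim.Adam(decoder.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()  # independent per-bit (multi-label) loss

sigma = 0.5  # AWGN standard deviation used for training (assumption)
for step in range(2000):
    idx = torch.randint(0, messages.shape[0], (256,))
    received = symbols[idx] + sigma * torch.randn(256, n)
    loss = loss_fn(decoder(received), messages[idx])
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Decoding: threshold each of the k outputs independently, then measure BER.
with torch.no_grad():
    received = symbols + sigma * torch.randn_like(symbols)
    decoded = (decoder(received) > 0).float()
    ber = (decoded != messages).float().mean().item()
    print(f"bit error rate at training SNR: {ber:.4f}")

Because each bit is thresholded independently, a block is counted correct only if all k decisions are correct, which is consistent with the abstract's observation that a multi-label decoder can match the MLD in bit error rate while still leaving more blocks in error.
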
Date of Conference: 07-11 June 2020
Date Added to IEEE Xplore: 27 July 2020
Conference Location: Dublin, Ireland
