
E-TRGAN: A Novel Transformer Generative Adversarial Network for High-Density Surface Electromyography Signal Reconstruction


Abstract:

This study presents a novel method to tackle the difficulties in high-density surface electromyography (HD-sEMG) signal reconstruction: the electromyographic transformer generative adversarial network (E-TRGAN). Conventional techniques struggle with the intricate spatial-temporal dynamics of HD-sEMG and mostly concentrate on bipolar surface electromyography (sEMG). By combining the transformer model with the generative adversarial network (GAN) framework, E-TRGAN reconstructs HD-sEMG signals while preserving their temporal and spatial integrity. We present a thorough evaluation, including comparisons with convolutional neural network (CNN)-based methods and linear interpolation, showing that E-TRGAN better recovers HD-sEMG signals under a variety of corruption conditions, such as reduced channels, random channel loss, and severe signal value loss. Metrics including mean squared error (MSE), mean absolute error (MAE), root mean squared error (RMSE), peak signal-to-noise ratio (PSNR), and structural similarity index (SSIM) show that the model consistently outperforms competing approaches. Although E-TRGAN's accuracy decreases in situations where spatial and temporal signal loss occur simultaneously, its overall efficacy represents a major advance in HD-sEMG signal processing. This study highlights E-TRGAN's promise as an adaptable approach for intricate HD-sEMG signal restoration in a range of applications.
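The abstract evaluates reconstruction quality with MSE, MAE, RMSE, and PSNR (and SSIM, not shown here). As a minimal sketch of how such metrics are typically computed between a reference channel and its reconstruction — the function name, the toy sine signal, and the peak-based PSNR definition are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def reconstruction_metrics(ref, rec):
    """Common reconstruction-quality metrics between a reference signal
    array and a reconstruction of the same shape (illustrative sketch)."""
    err = ref - rec
    mse = float(np.mean(err ** 2))
    mae = float(np.mean(np.abs(err)))
    rmse = float(np.sqrt(mse))
    peak = float(np.max(np.abs(ref)))  # signal peak used as MAX in PSNR
    psnr = 10 * np.log10(peak ** 2 / mse) if mse > 0 else float("inf")
    return {"mse": mse, "mae": mae, "rmse": rmse, "psnr": psnr}

# Toy example: one sinusoidal "channel"; the reconstruction is the
# reference plus small Gaussian noise (stand-in for model output).
t = np.linspace(0.0, 1.0, 1000)
ref = np.sin(2 * np.pi * 10 * t)
rec = ref + 0.01 * np.random.default_rng(0).standard_normal(t.shape)

m = reconstruction_metrics(ref, rec)
print(m)  # low MSE/RMSE and high PSNR indicate a faithful reconstruction
```

SSIM, by contrast, compares local means, variances, and covariances over sliding windows, so it is usually taken from an image-processing library rather than written inline.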
Article Sequence Number: 4011013
Date of Publication: 03 October 2024
