Automatic sign language translation remains one of the most complex and challenging tasks in video recognition and processing. This work presents the Brazilian Sign Language Automatic Translation project, focusing specifically on low-complexity Artificial Neural Networks dedicated to real-time video processing. A new approach for reducing the computational complexity of the activation function of the Multi-Layer Perceptron is proposed, allowing complex video signal processing to be performed in real time. The low-complexity neural networks are used in two stages of the system: the color detection and hand posture classification blocks. The obtained results indicate an increase in the frame rate from 8.6 fps to 28.1 fps on a personal computer with a USB webcam, with no reduction in the correct recognition rate.
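The abstract does not specify how the activation function's computational cost is reduced. As one illustration of the general technique, a minimal sketch of a piecewise-linear approximation of the logistic sigmoid is shown below; the segment breakpoints and slopes here are assumptions chosen for this example, not the approximation used in the paper. The idea is to replace the per-neuron `exp()` evaluation with a few comparisons and one multiply-add:

```python
import math

def sigmoid(x):
    """Exact logistic sigmoid: 1 / (1 + e^(-x))."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_pwl(x):
    """Piecewise-linear sigmoid approximation (illustrative sketch only;
    breakpoints/slopes are assumed, not the paper's actual scheme).

    Saturates at 0 and 1 outside [-4, 4]; three linear segments inside.
    Replaces the costly exp() with comparisons and one multiply-add,
    which is the typical way to cut MLP activation cost on a CPU.
    """
    if x <= -4.0:
        return 0.0
    if x >= 4.0:
        return 1.0
    if x < -2.0:
        return 0.0625 * (x + 4.0)        # slope 1/16 on [-4, -2]
    if x > 2.0:
        return 1.0 - 0.0625 * (4.0 - x)  # mirror segment on [2, 4]
    return 0.5 + 0.1875 * x              # slope 3/16 on [-2, 2]
```

The segments are continuous at the breakpoints and monotonic, and the maximum deviation from the exact sigmoid stays below 0.05 over the whole real line; in many classification settings such a deviation does not change the decision boundary, which is consistent with the abstract's claim of a speed-up without loss of recognition rate.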