
A Real-Time Image Row-Compression Method for High-Definition USB Cameras Based on FPGA


Figure: Flowchart of the image compression method based on intra-row data comparison

Abstract:

A real-time image compression method based on a field programmable gate array (FPGA) is proposed to address the problem of transmitting high-frame-rate, high-resolution camera images over the limited bandwidth of the universal serial bus (USB). The method quantizes image pixels on a per-row basis, exploiting the high correlation between adjacent pixels within a row to reduce the data volume of a single frame. The algorithm is also designed to minimize decoding complexity on the central processor at the receiving end. Corresponding hardware circuits and software programs were designed and tested on an experimental platform. The experimental results show that this method losslessly compresses image data on the board and improves the transmission frame rate. The maximum frame rate for 1280 × 1280 images tested in a USB 2.0 environment reaches 25.58 fps, an improvement of 11.67 fps over raw data transfer, with a compression rate of up to 55.8%. Furthermore, the method outperforms PNG decoding in decoding speed, supports multi-core decoding, and achieves a peak decoding speed of 61 fps on a 1920 × 1080 image with 16 threads. This method provides a feasible solution for real-time compressed transmission from high-speed, high-definition cameras in industrial settings.
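The abstract does not specify the exact intra-row quantization format, but the core idea of exploiting correlation between adjacent pixels in a row can be illustrated with a simple delta-coding sketch. The function names below are hypothetical and only show why row-wise differencing is lossless and cheap to decode; the paper's actual bitstream layout may differ.

```python
def encode_row(row):
    """Store the first pixel, then each pixel's difference
    from its left neighbor. Neighboring pixels in natural
    images are usually close in value, so the differences
    cluster near zero and compress well."""
    out = [row[0]]
    for prev, cur in zip(row, row[1:]):
        out.append(cur - prev)
    return out

def decode_row(deltas):
    """Reverse the delta coding: a running sum restores
    every pixel exactly, so the scheme is lossless and
    each row can be decoded independently (which is what
    makes per-row multi-threaded decoding possible)."""
    row = [deltas[0]]
    for d in deltas[1:]:
        row.append(row[-1] + d)
    return row

row = [120, 121, 121, 119, 118, 200, 201]
encoded = encode_row(row)
assert decode_row(encoded) == row  # lossless round trip
```

Because each row is self-contained, rows can be distributed across CPU cores at the receiver, consistent with the multi-core decoding result reported above.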
Published in: IEEE Access ( Volume: 12)
Page(s): 64663 - 64671
Date of Publication: 26 April 2024
Electronic ISSN: 2169-3536
