
No-Reference VMAF: A Deep Neural Network-Based Approach to Blind Video Quality Assessment



Abstract:

As the demand for high-quality video content continues to rise, accurately assessing the visual quality of digital videos has become more crucial than ever before. However, evaluating the perceptual quality of an impaired video in the absence of the original reference signal remains a significant challenge. To address this problem, we propose a novel No-Reference (NR) video quality metric called NR-VMAF. Our method is designed to replicate the popular Full-Reference (FR) metric VMAF in scenarios where the reference signal is unavailable or impractical to obtain. Like its FR counterpart, NR-VMAF is tailored specifically for measuring video quality in the presence of compression and scaling artifacts. The proposed model utilizes a deep convolutional neural network to extract quality-aware features from the pixel information of the distorted video, thereby eliminating the need for manual feature engineering. By adopting a patch-based approach, we are able to process high-resolution video data without any information loss. While the current model is trained solely on H.265/HEVC videos, its performance is verified on subjective datasets containing mainly H.264/AVC content. We demonstrate that NR-VMAF outperforms current state-of-the-art NR metrics while achieving a prediction accuracy that is comparable to VMAF and other FR metrics. Based on this strong performance, we believe that NR-VMAF is a viable approach to efficient and reliable No-Reference video quality assessment.
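The abstract describes the approach only at a high level: a deep convolutional network extracts quality-aware features from patches of the distorted frames and regresses a VMAF-like score, with no reference signal involved. The sketch below illustrates that general idea under stated assumptions; the layer sizes, the 64-pixel patch size, and the simple mean pooling over patches are illustrative choices, not the architecture published in the paper.

# Hypothetical sketch of a patch-based no-reference quality predictor in the
# spirit of NR-VMAF. All hyperparameters below are illustrative assumptions.
import torch
import torch.nn as nn


class PatchQualityCNN(nn.Module):
    """Small CNN mapping one video patch to a scalar quality score."""

    def __init__(self) -> None:
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, 1),  # VMAF-like score regressed per patch
        )

    def forward(self, patches: torch.Tensor) -> torch.Tensor:
        # patches: (N, 3, H, W) batch of patches taken from one distorted frame
        return self.regressor(self.features(patches)).squeeze(-1)


def frame_to_patches(frame: torch.Tensor, patch: int = 64) -> torch.Tensor:
    """Tile a (3, H, W) frame into non-overlapping patches at native resolution,
    so the pixels that are kept are never rescaled (a simplified take on the
    'no information loss' idea from the abstract; any ragged border is dropped
    here for brevity)."""
    c, h, w = frame.shape
    tiles = frame.unfold(1, patch, patch).unfold(2, patch, patch)
    return tiles.permute(1, 2, 0, 3, 4).reshape(-1, c, patch, patch)


if __name__ == "__main__":
    model = PatchQualityCNN()
    frame = torch.rand(3, 1080, 1920)             # one distorted frame, no reference needed
    patches = frame_to_patches(frame)              # (N, 3, 64, 64)
    with torch.no_grad():
        per_patch_scores = model(patches)
    frame_score = per_patch_scores.mean().item()   # simple mean pooling over patches
    print(f"predicted frame-level quality: {frame_score:.2f}")

Mean pooling per frame (and then over frames for a sequence-level score) is the simplest temporal aggregation one could use here; the published model may pool and train differently.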
Published in: IEEE Transactions on Broadcasting (Volume: 70, Issue: 3, September 2024)
Page(s): 844 - 861
Date of Publication: 19 June 2024

