
CMCF-Net: An End-to-End Context Multiscale Cross-Fusion Network for Robust Copy-Move Forgery Detection


Abstract:

Image copy-move forgery detection (CMFD) has become a challenging problem: increasingly powerful editing software makes forged images ever more realistic. Existing algorithms that directly connect multiple scales of features in the encoder may fail to aggregate contextual information effectively, resulting in poor performance. In this paper, an end-to-end context multiscale cross-fusion network (CMCF-Net) is proposed to detect image copy-move forgery. The network consists of a multiscale feature extraction fusion (MSF) module and a multi-information fusion decoding (MFD) module. The MSF module efficiently extracts and fuses multiscale information via stacked-scale feature fusion, improving the network's ability to localize forgeries on objects of different scales. The MFD module combines contextual information and applies weighted fusion of multiscale information, guiding the network to obtain relevant clues from correlated information at multiple scales. Experimental results and analysis demonstrate that the proposed CMCF-Net achieves the best localization results among the compared methods, with higher robustness.
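The weighted fusion of multiscale information described above can be illustrated with a minimal sketch. This is not the authors' implementation: the function names, the three-level pyramid, nearest-neighbor upsampling, and the scalar per-scale softmax weights are all assumptions chosen only to show the general idea of resampling feature maps to a common resolution and combining them with learned weights.

```python
import numpy as np

def upsample_nearest(x, factor):
    """Nearest-neighbor upsampling of an (H, W, C) feature map (assumed scheme)."""
    return x.repeat(factor, axis=0).repeat(factor, axis=1)

def fuse_multiscale(features, logits):
    """Softmax-weighted fusion of multiscale maps at the finest resolution.

    features: list of (H/2^i, W/2^i, C) maps, finest scale first.
    logits:   one scalar per scale; softmax-normalized into fusion weights.
    """
    target_h = features[0].shape[0]
    w = np.exp(logits - np.max(logits))
    w = w / w.sum()                      # softmax over scales
    fused = np.zeros_like(features[0], dtype=float)
    for wi, f in zip(w, features):
        factor = target_h // f.shape[0]  # bring each scale to the target size
        fused += wi * upsample_nearest(f, factor)
    return fused

# A hypothetical three-level, 3-channel feature pyramid.
rng = np.random.default_rng(0)
feats = [rng.random((16 // 2**i, 16 // 2**i, 3)) for i in range(3)]
fused = fuse_multiscale(feats, np.zeros(3))
print(fused.shape)  # (16, 16, 3)
```

In the paper the fusion weights would be produced by the network rather than fixed logits, and upsampling in such decoders is typically bilinear; the sketch only conveys the cross-scale weighting structure.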
Published in: IEEE Transactions on Multimedia ( Volume: 26)
Page(s): 6090 - 6101
Date of Publication: 20 December 2023

