
Exploiting temporal information to prevent the transferability of adversarial examples against deep fake detectors


Abstract:

The diffusion of AI tools capable of generating realistic DeepFake (DF) videos raises serious threats to face-based biometric recognition systems. For this reason, several detectors based on Deep Neural Networks (DNNs) have been developed to distinguish between real and DF videos. Despite their good performance, these methods are vulnerable to adversarial attacks. In this paper, we argue that it is possible to increase the resilience of DNN-based DF detectors against black-box adversarial attacks by exploiting the temporal information contained in the video. In fact, using such information significantly decreases the transferability of adversarial examples from a source to a target model, making it difficult to launch an attack without accessing the target network. To back this claim, we trained two convolutional neural networks (CNNs) to detect DF videos and measured their robustness against black-box, transfer-based attacks. We also trained two detectors by adding to the CNNs a long short-term memory (LSTM) layer to extract temporal information. Then, we measured the transferability of adversarial examples towards the LSTM networks. Our results suggest that the methods based on temporal information are less prone to black-box attacks.
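The sketch below illustrates the two ingredients the abstract describes: a frame-level CNN extended with an LSTM layer to capture temporal information, and a transfer-based black-box attack evaluation in which adversarial examples are crafted on a source model and tested on a target model. The backbone (ResNet-18), layer sizes, and the FGSM attack are illustrative assumptions, not the exact configuration used in the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import models


class CnnLstmDetector(nn.Module):
    """Per-frame CNN features aggregated over time by an LSTM (assumed design)."""

    def __init__(self, hidden_size=256):
        super().__init__()
        backbone = models.resnet18(weights=None)                         # frame-level CNN (assumption)
        self.features = nn.Sequential(*list(backbone.children())[:-1])   # drop the final fc layer
        self.lstm = nn.LSTM(input_size=512, hidden_size=hidden_size,
                            batch_first=True)                            # temporal aggregation
        self.classifier = nn.Linear(hidden_size, 2)                      # real vs. DeepFake

    def forward(self, clips):
        # clips: (batch, time, channels, height, width)
        b, t, c, h, w = clips.shape
        x = self.features(clips.reshape(b * t, c, h, w)).reshape(b, t, -1)
        _, (h_n, _) = self.lstm(x)                                       # last hidden state per clip
        return self.classifier(h_n[-1])


def fgsm_transfer_attack(source_model, target_model, clips, labels, eps=4 / 255):
    """Craft FGSM examples on the source model and measure how often they fool the target."""
    clips = clips.clone().requires_grad_(True)
    loss = F.cross_entropy(source_model(clips), labels)
    loss.backward()
    adv = (clips + eps * clips.grad.sign()).clamp(0, 1).detach()
    with torch.no_grad():
        fooled = target_model(adv).argmax(dim=1) != labels
    return fooled.float().mean()                                         # transfer success rate
```

Under this setup, a lower transfer success rate on the LSTM-based target, for attacks crafted on a purely frame-based source, would be consistent with the paper's claim that temporal information hinders black-box transferability.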
Date of Conference: 10-13 October 2022
Date Added to IEEE Xplore: 17 January 2023
Conference Location: Abu Dhabi, United Arab Emirates

