E-VFIA: Event-Based Video Frame Interpolation with Attention


Abstract:

Video frame interpolation (VFI) is a fundamental vision task that aims to synthesize several frames between two consecutive original video images. Most algorithms attempt VFI using only keyframes, which is an ill-posed problem since the keyframes usually do not provide accurate information about the trajectories of objects in the scene. Event-based cameras, by contrast, capture more precise information between the keyframes of a video. Some recent state-of-the-art event-based methods use event data to improve optical flow estimation and interpolate video frames by warping; nonetheless, these methods suffer heavily from ghosting artifacts. Meanwhile, some kernel-based VFI methods that use only frames as input have shown that deformable convolutions, when backed by transformers, can be a reliable way of handling long-range dependencies. We propose event-based video frame interpolation with attention (E-VFIA), a lightweight kernel-based method. E-VFIA fuses event information with standard video frames through deformable convolutions to generate high-quality interpolated frames. The proposed method represents events with high temporal resolution and uses a multi-head self-attention mechanism to better encode event-based information, while being less vulnerable to blurring and ghosting artifacts, thus generating crisper frames. The simulation results show that the proposed technique outperforms current state-of-the-art methods (both frame- and event-based) with a significantly smaller model size. Multimedia material: The code is available at https://github.com/ahmetakman/E-VFIA
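The abstract mentions representing events with high temporal resolution before encoding them with self-attention. A common way to obtain such a representation from a raw event stream is a temporal voxel grid, where each event's polarity is split between its two nearest time bins. The sketch below is a hedged illustration of this general technique, not the paper's exact preprocessing; the function name and the bilinear temporal binning are assumptions.

```python
import numpy as np

def events_to_voxel_grid(events, num_bins, height, width):
    """Accumulate events of shape (N, 4) with rows (t, x, y, polarity)
    into a (num_bins, height, width) temporal voxel grid.

    Each event's polarity is split between the two nearest time bins
    (bilinear interpolation in time), preserving sub-bin timing.
    NOTE: illustrative sketch only; E-VFIA's actual event
    representation may differ in detail.
    """
    voxel = np.zeros((num_bins, height, width), dtype=np.float64)
    t = events[:, 0].astype(np.float64)
    x = events[:, 1].astype(int)
    y = events[:, 2].astype(int)
    p = np.where(events[:, 3] > 0, 1.0, -1.0)

    # Normalize timestamps to the range [0, num_bins - 1].
    t_norm = (num_bins - 1) * (t - t.min()) / max(t.max() - t.min(), 1e-9)
    left = np.floor(t_norm).astype(int)
    right = np.minimum(left + 1, num_bins - 1)
    w_right = t_norm - left  # fractional distance to the right bin

    # np.add.at handles repeated (bin, y, x) indices correctly.
    np.add.at(voxel, (left, y, x), p * (1.0 - w_right))
    np.add.at(voxel, (right, y, x), p * w_right)
    return voxel
```

Each of the `num_bins` channels can then be treated as a token (after flattening or patching) for a multi-head self-attention encoder, which is the kind of event encoding the paper describes.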
Date of Conference: 29 May 2023 - 02 June 2023
Date Added to IEEE Xplore: 04 July 2023
Conference Location: London, United Kingdom

I. Introduction

In robotics applications involving fast-moving agents, high frame-rate video streams are necessary for agile reaction of control systems. Dedicated hardware exists for capturing high frame-rate video; however, it is quite expensive. Therefore, increasing the frame rate through additional processing yields more affordable high frame-rate video and smoother slow-motion footage.
