Abstract:
Conventional intention recognition for a group of sea-surface targets mainly focuses on the short-term state of motion without considering the long-term series of actions and collaborative relationships, thereby ignoring the dynamic and temporal variations of the targets. In this article, we propose a cell-averaging constant false alarm rate (CA-CFAR) processor with parameter optimization based on the particle swarm optimization (PSO) algorithm to suppress sea clutter and obtain target positions. We then propose an interactive feature extraction network based on a graph neural network (GNN) that effectively improves tactical intention recognition of sea-surface targets and can be used with unmanned surface vehicles (USVs). First, the motion state sequences of the targets and USVs are encoded by a transformer using relative position information to obtain a high-dimensional feature. Second, an action difference feature extraction module (ADFEM) is designed to extract the salient motion features between targets and USVs, which reflect how the targets react to the USVs' movements. Third, an intent recipient feature extraction module (IRFEM) is proposed to extract the assigned subtask features of the targets, reflecting which USV is the intended recipient of each target's actions. Finally, the collaborative feature extraction module (CFEM), based on a GNN, is designed to dynamically extract the cooperative features of targets using dynamic attention over a long-term sequence of actions, reflecting the way targets cooperate to accomplish a certain intention. We present the first public benchmark dataset for evaluating methods that recognize the intention of a group of sea-surface targets. Experimental results indicate that the accuracy (Acc), precision (Pre), recall (Rec), and F1-score (F1) of the proposed method are 0.9666, 0.9675, 0.9666, and 0.9668, respectively, demonstrating the high robustness of the proposed method. The proposed m...
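To make the detection front end concrete, below is a minimal sketch of a 1-D cell-averaging CFAR detector. The window sizes (num_train, num_guard) and false-alarm probability (pfa) are hypothetical defaults; in the article these are the parameters tuned by PSO, and the threshold formula here assumes exponentially distributed clutter power, which is a textbook simplification rather than the authors' exact sea-clutter model.

```python
import numpy as np

def ca_cfar(power, num_train=16, num_guard=4, pfa=1e-4):
    """Cell-averaging CFAR over a 1-D range profile of power samples.

    num_train / num_guard: training and guard cells on EACH side of the
    cell under test (hypothetical defaults; the paper tunes these via PSO).
    """
    n = len(power)
    num_cells = 2 * num_train
    # Threshold scaling for a fixed false-alarm rate under the
    # exponential-clutter assumption (standard CA-CFAR result).
    alpha = num_cells * (pfa ** (-1.0 / num_cells) - 1.0)

    detections = np.zeros(n, dtype=bool)
    half = num_train + num_guard
    for i in range(half, n - half):
        # Training cells on both sides, skipping the guard cells.
        window = np.r_[power[i - half:i - num_guard],
                       power[i + num_guard + 1:i + half + 1]]
        noise_est = window.mean()              # cell-averaging step
        detections[i] = power[i] > alpha * noise_est
    return detections

# Usage on a synthetic range line: detections mark candidate target cells.
echo = np.random.exponential(1.0, 512)
echo[200] = 40.0                               # injected target
hits = np.flatnonzero(ca_cfar(echo))
```

In the article, PSO searches over such CA-CFAR parameters to balance clutter suppression against missed detections; the fixed values above stand in for that optimized configuration.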
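The "dynamic attention" in the CFEM is consistent with a GATv2-style scoring rule, in which the nonlinearity is applied before the attention vector so that attention rankings can differ per query node. The sketch below illustrates that mechanism under this assumption; the layer sizes, the fully connected target graph, and the PyTorch framing are ours, not the paper's specification.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DynamicAttentionLayer(nn.Module):
    """GATv2-style dynamic attention over a fully connected target graph.

    A minimal sketch of the kind of layer a CFEM could use to fuse
    per-target features; dimensions and topology are assumptions.
    """
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.w_src = nn.Linear(in_dim, out_dim, bias=False)
        self.w_dst = nn.Linear(in_dim, out_dim, bias=False)
        self.attn = nn.Linear(out_dim, 1, bias=False)

    def forward(self, h):                      # h: (num_targets, in_dim)
        src = self.w_src(h).unsqueeze(1)       # (N, 1, out_dim)
        dst = self.w_dst(h).unsqueeze(0)       # (1, N, out_dim)
        # "Dynamic" attention: the nonlinearity precedes the scoring
        # vector, so the score is not a fixed ranking over keys (GATv2).
        scores = self.attn(F.leaky_relu(src + dst, 0.2)).squeeze(-1)
        alpha = F.softmax(scores, dim=-1)      # (N, N) attention weights
        # Aggregate transformed neighbor features (values share w_dst
        # here for brevity; a separate value projection is equally valid).
        return alpha @ self.w_dst(h)

# Usage: fuse features of 6 targets into cooperation-aware embeddings.
layer = DynamicAttentionLayer(in_dim=64, out_dim=32)
fused = layer(torch.randn(6, 64))              # (6, 32)
```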
Published in: IEEE Sensors Journal (Volume: 25, Issue: 3, 01 February 2025)