
Mobile Parcels' Grasping Detection System by the Neuromorphic Vision and Efficient Fusion Network



Abstract:

The increasing popularity of online shopping has resulted in a surge of parcels that need to be sorted, posing great challenges to sorting work. Robotic grasping can greatly improve sorting efficiency. However, dynamically grasping moving parcels demands higher detection speed and grasping-pose calculation accuracy. To address these requirements, this study proposes a new grasping system based on neuromorphic vision (NeuroIV), which offers the advantages of low latency and lightweight computing. As neuromorphic sensing is still a young field, neuromorphic cameras are rarely used in robotic grasping; in view of this, we present a novel parcel-grasping dataset. We then design a double-channel down-sampling and grasping network (DCDG-Net), which extracts abundant features with ResNet and transformer branches, respectively. To mitigate the computational burden introduced by the dual-channel network, we design a feature-vector multiplication to replace the dot-product multiplication, thereby reducing the computational load among different matrices. Furthermore, channel and spatial attention are fused to construct a multidimensional network that suppresses noisy features and highlights useful information. Finally, we evaluate the proposed method in real-world scenarios. Together with qualitative and quantitative comparisons, this work provides state-of-the-art grasping detection with the new NeuroIV dataset and network.
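The abstract's claim about replacing dot-product multiplication with feature-vector multiplication can be illustrated by the standard linearized-attention reordering: instead of forming the N x N score matrix, the K-V product (d x d) is computed first. This is a minimal sketch of that general idea, not the paper's exact DCDG-Net formulation; the feature map `phi` and all shapes here are illustrative assumptions.

```python
import numpy as np

def dot_product_attention(Q, K, V):
    # Standard attention: the N x N score matrix costs O(N^2 * d).
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

def linearized_attention(Q, K, V):
    # Reordered "feature-vector" form: K.T @ V is only d x d, so the
    # full product costs O(N * d^2) -- cheaper whenever N >> d.
    phi = lambda x: np.maximum(x, 0) + 1e-6   # assumed positive feature map
    Qf, Kf = phi(Q), phi(K)
    KV = Kf.T @ V                              # (d, d) summary of keys/values
    Z = Qf @ Kf.sum(axis=0, keepdims=True).T   # (N, 1) normalizer
    return (Qf @ KV) / Z

# Toy shapes: N tokens of dimension d.
N, d = 256, 32
rng = np.random.default_rng(0)
Q, K, V = rng.standard_normal((3, N, d))
out = linearized_attention(Q, K, V)
print(out.shape)  # (256, 32)
```

The two functions are not numerically identical (the softmax is replaced by the feature map), but the reordering shows how the quadratic-in-N cost of the dot-product form can be avoided, which is the kind of saving the abstract attributes to its feature-vector multiplication.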
Published in: IEEE Transactions on Mobile Computing ( Early Access )
Page(s): 1 - 14
Date of Publication: 01 April 2025


