
MBSeg: Real-Time Contour-Based Instance Segmentation Method for Egocentric Vision


Abstract:

The instance segmentation task provides perceptual intelligence for Internet of Things (IoT) devices by identifying various objects in complex environments. However, achieving real-time performance on resource-constrained edge IoT devices is challenging because of the complexity of instance segmentation. In this article, we present a contour-based segmentation approach utilizing macroblocks (MBs), termed MBSeg. We utilize novel data, the MBs of H.265 videos, to guide object segmentation. Thanks to the prevalence of video encoding and decoding chips, this data is lightweight, fast, and easily accessible. MBSeg is a two-stage method. First, a lightweight object detection network acquires object positions and extracts features. Subsequently, we generate rough object contours from MBs, which are input into MBSnake, an active contour model optimized for edge devices, for further deformation. We introduce angle value evaluation of vertices to balance MBSnake’s accuracy and speed. We have implemented and validated MBSeg on commercial Android devices. Results demonstrate that MBSeg achieves 30 frames per second throughput on egocentric videos with multiple objects, a 6.4× speedup over YolactEdge, an edge instance segmentation method, with only a 13.8% drop in accuracy. This advancement provides a valuable foundation for egocentric visual wearable IoT.
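The abstract mentions angle value evaluation of contour vertices as the mechanism for trading accuracy against speed in MBSnake. The paper's exact formulation is not given here, but the underlying idea can be sketched as follows: compute the interior angle at each vertex of the rough MB-derived contour, and freeze near-flat vertices so the active contour model only deforms vertices that carry shape information. The function names and the 160° flatness threshold below are illustrative assumptions, not the paper's implementation.

```python
import math

def vertex_angles(contour):
    """Interior angle (degrees) at each vertex of a closed polygon contour,
    given as a list of (x, y) points in order."""
    n = len(contour)
    angles = []
    for i in range(n):
        px, py = contour[i - 1]          # previous vertex (wraps around)
        cx, cy = contour[i]              # current vertex
        nx, ny = contour[(i + 1) % n]    # next vertex
        v1 = (px - cx, py - cy)          # edge toward previous vertex
        v2 = (nx - cx, ny - cy)          # edge toward next vertex
        dot = v1[0] * v2[0] + v1[1] * v2[1]
        norm = math.hypot(*v1) * math.hypot(*v2)
        cos_a = max(-1.0, min(1.0, dot / norm))  # clamp for numerical safety
        angles.append(math.degrees(math.acos(cos_a)))
    return angles

def select_active_vertices(contour, flat_threshold=160.0):
    """Indices of vertices whose angle deviates from a straight line.
    Near-flat vertices (angle close to 180 degrees) lie on an almost straight
    segment and can be skipped during deformation to cut per-iteration cost."""
    return [i for i, a in enumerate(vertex_angles(contour))
            if a < flat_threshold]
```

For example, a unit square yields 90° at every corner, so all four vertices stay active; inserting a collinear midpoint on one edge gives that vertex a 180° angle, and it is excluded from deformation.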
Published in: IEEE Internet of Things Journal ( Volume: 11, Issue: 12, 15 June 2024)
Page(s): 21486 - 21498
Date of Publication: 19 March 2024

