
BiFormer for Scene Graph Generation Based on VisionNet With Taylor Hiking Optimization Algorithm


Abstract:

Scene Graph Generation (SGG) plays a vital role in determining the graph structure of an image by classifying objects based on their pairwise visual relationships. In SGG, a graph is generated in which nodes represent object classes and edges represent the visual relationships between objects. Various schemes have been developed to generate scene graphs; however, these techniques require significant computational resources and time. In this study, a deep learning-based optimization model, VisionNet_Taylor Hiking Optimization Algorithm (VisionNet_THOA), is introduced to generate high-quality scene graphs from noisy samples. First, objects are detected by performing semantic segmentation with dynamic routing. The attention regions and actions are then determined using the BiFormer method. The nodes of the graph represent the detected objects, and the detected actions are represented by the edges. Prediction head classification is performed with VisionNet to predict object labels and relationships. The performance of VisionNet is further improved by tuning its hyperparameters with the Taylor hiking optimization algorithm (THOA). Extensive experiments show that VisionNet_THOA attains an accuracy of 94.867%, a True Negative Rate (TNR) of 93.877%, a True Positive Rate (TPR) of 96.654%, a Precision of 91.765%, and an F-Measure of 94.146%.
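The abstract's final pipeline stage assembles a graph whose nodes are the detected objects and whose edges are the predicted actions. As a rough illustration of that assembly step only (a minimal sketch, not the authors' implementation: the Detection class, the build_scene_graph function, and the example labels are hypothetical), the following Python fragment shows how detector outputs and predicted subject-predicate-object triples could be combined into a scene graph:

    # Illustrative sketch (not the paper's code): object detections become
    # graph nodes; predicted pairwise relationships (actions) become edges.
    from dataclasses import dataclass

    @dataclass
    class Detection:
        obj_id: int
        label: str        # e.g. "person", "horse"
        box: tuple        # (x1, y1, x2, y2) bounding box

    def build_scene_graph(detections, relations):
        """detections: list[Detection]; relations: list of
        (subject_id, predicate, object_id) triples, e.g. (0, "riding", 1)."""
        nodes = {d.obj_id: d.label for d in detections}
        # Keep only relations whose endpoints were actually detected.
        edges = [(s, p, o) for (s, p, o) in relations
                 if s in nodes and o in nodes]
        return nodes, edges

    # Example: two detected objects and one predicted relationship.
    dets = [Detection(0, "person", (10, 20, 50, 120)),
            Detection(1, "horse", (40, 30, 160, 140))]
    rels = [(0, "riding", 1)]
    nodes, edges = build_scene_graph(dets, rels)
    print(nodes)   # {0: 'person', 1: 'horse'}
    print(edges)   # [(0, 'riding', 1)]

In the paper's pipeline, the detections would come from the dynamic-routing semantic segmentation stage and the relation triples from the BiFormer attention and VisionNet prediction heads; here both are supplied by hand purely for illustration.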
Graphical abstract: Abstract view of VisionNet_THOA for SGG
Published in: IEEE Access (Volume: 13)
Page(s): 57207 - 57222
Date of Publication: 27 March 2025
Electronic ISSN: 2169-3536
