Semantic Segmentation for Robotic Apple Harvesting: A Deep Learning Approach Leveraging U-Net, Synthetic Data, and Domain Adaptation


Abstract:

This paper introduces a deep learning-based semantic segmentation framework tailored for robotic apple harvesting, leveraging synthetic data generated within a 3D simulated apple orchard. The proposed simulation environment replicates real-world scenarios, encompassing challenges such as occlusion, variation among apple varieties, and changing lighting conditions. This approach eliminates the extensive costs and complexities associated with collecting real-world datasets, particularly in unpredictable agricultural settings. The synthetic dataset, rendered in the Gazebo physics engine from viewpoints consistent with a robotic harvester's camera, provides a comprehensive range of scenarios for robust model training. For validation, we deploy U-Net, a fully convolutional neural network, emphasizing its adaptability to domain shifts between synthetic and real-world data. By integrating strategies such as domain adaptation, data augmentation, and the inclusion of pre-trained ResNet-50 encoders in the U-Net framework, we demonstrate superior performance in detecting and segmenting apples under diverse real-world conditions compared to standard U-Net models and traditional computer vision techniques. The results highlight the potential of synthetic data for deep learning-based semantic segmentation, offering a cost-effective and scalable solution when real-world data are limited or hard to collect.
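
For illustration only (the abstract above is the only part of the paper reproduced here, and it includes no code), the following is a minimal PyTorch sketch of the kind of architecture it describes: a U-Net decoder grafted onto an ImageNet-pretrained ResNet-50 encoder with a binary apple/background head. The class names (`ResNet50UNet`, `DecoderBlock`), channel widths, and decoder design are assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import resnet50, ResNet50_Weights


class DecoderBlock(nn.Module):
    """Upsample, concatenate the encoder skip connection, then refine."""

    def __init__(self, in_ch, skip_ch, out_ch):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(in_ch + skip_ch, out_ch, 3, padding=1, bias=False),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1, bias=False),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x, skip):
        x = F.interpolate(x, size=skip.shape[-2:], mode="bilinear",
                          align_corners=False)
        return self.conv(torch.cat([x, skip], dim=1))


class ResNet50UNet(nn.Module):
    """U-Net with a pretrained ResNet-50 encoder (hypothetical sketch)."""

    def __init__(self, num_classes=2):  # apple vs. background, assumed
        super().__init__()
        enc = resnet50(weights=ResNet50_Weights.IMAGENET1K_V2)
        self.stem = nn.Sequential(enc.conv1, enc.bn1, enc.relu)  # 1/2, 64 ch
        self.pool = enc.maxpool                                  # 1/4
        self.layer1, self.layer2 = enc.layer1, enc.layer2        # 256, 512 ch
        self.layer3, self.layer4 = enc.layer3, enc.layer4        # 1024, 2048 ch
        self.up4 = DecoderBlock(2048, 1024, 512)
        self.up3 = DecoderBlock(512, 512, 256)
        self.up2 = DecoderBlock(256, 256, 128)
        self.up1 = DecoderBlock(128, 64, 64)
        self.head = nn.Conv2d(64, num_classes, kernel_size=1)

    def forward(self, x):
        s0 = self.stem(x)                 # 1/2 resolution
        s1 = self.layer1(self.pool(s0))   # 1/4
        s2 = self.layer2(s1)              # 1/8
        s3 = self.layer3(s2)              # 1/16
        s4 = self.layer4(s3)              # 1/32 (bottleneck)
        d = self.up4(s4, s3)
        d = self.up3(d, s2)
        d = self.up2(d, s1)
        d = self.up1(d, s0)
        logits = self.head(d)
        # Restore full input resolution for per-pixel prediction.
        return F.interpolate(logits, size=x.shape[-2:], mode="bilinear",
                             align_corners=False)


if __name__ == "__main__":
    model = ResNet50UNet(num_classes=2)
    out = model(torch.randn(1, 3, 480, 640))  # e.g. a harvester camera frame
    print(out.shape)  # torch.Size([1, 2, 480, 640])
```

The pretrained encoder is one plausible way to realize the domain-adaptation benefit the abstract attributes to ResNet-50 features: the encoder starts from representations learned on natural images, so only the decoder (and optionally the later encoder stages) must adapt to the synthetic-to-real shift.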
Date of Conference: 24-27 June 2024
Date Added to IEEE Xplore: 26 July 2024
Conference Location: New York, NY, USA