
Building detection in aerial images using fusion of U-Net and ResNet architectures


Abstract:

Semantic segmentation is the task of assigning a class label to every pixel in an image. With the increasing availability of high-resolution remote sensing data, interest in segmenting urban features such as buildings, roads, bridges and railways has grown considerably. Building detection is of particular interest for urban planning, disaster management and 3D modelling. Deep learning algorithms achieve high accuracy and remain effective for building detection even under challenging conditions. In this study, we implement building detection using a deep CNN architecture, the Residual Network (ResNet), as the backbone model, with U-Net as the base model, on the Massachusetts Buildings Dataset. The aerial image dataset is first trained on the U-Net model for 10 epochs with a batch size of 12, using a combined binary cross-entropy-focal-dice loss. During evaluation, various ResNet architectures are used as backbone models. Mean Intersection over Union (MeanIoU), Intersection over Union (IoU) score, accuracy, precision, recall and F1 score are the evaluation measures, assessed with the same loss function across all ResNet architectures. The results indicate that the ResNet-101 model performs significantly better than the alternative ResNet architectures: it achieves a MeanIoU of 72.67% and an IoU score of 87.03%, while accuracy, precision, recall and F1 score are 93.06%, 86.61% and 92.91%, respectively.
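
The training setup described in the abstract can be illustrated with a minimal sketch. The snippet below assumes the segmentation_models package on top of tf.keras (the paper does not state its implementation framework) and uses randomly generated placeholder arrays in place of the Massachusetts Buildings patches; only the backbone name, loss combination, batch size and epoch count follow the abstract.

```python
# Minimal sketch (not the authors' code): U-Net decoder fused with a
# ResNet-101 encoder, trained for 10 epochs at batch size 12 with a
# combined binary cross-entropy + focal + dice loss, as described in the abstract.
import numpy as np
import tensorflow as tf
import segmentation_models as sm

sm.set_framework('tf.keras')

BACKBONE = 'resnet101'            # swap for 'resnet34', 'resnet50', ... to compare backbones
preprocess = sm.get_preprocessing(BACKBONE)

# Placeholder arrays standing in for 256x256 aerial patches and binary building masks.
x_train = preprocess(np.random.randint(0, 256, (24, 256, 256, 3)).astype('float32'))
y_train = (np.random.rand(24, 256, 256, 1) > 0.5).astype('float32')

# U-Net with a ResNet-101 encoder pre-trained on ImageNet.
model = sm.Unet(BACKBONE, encoder_weights='imagenet', classes=1, activation='sigmoid')

# Combined loss mirroring the "binary cross-entropy-focal-dice" loss named in the abstract.
total_loss = sm.losses.BinaryCELoss() + sm.losses.BinaryFocalLoss() + sm.losses.DiceLoss()

model.compile(
    optimizer='adam',
    loss=total_loss,
    metrics=[
        sm.metrics.IOUScore(threshold=0.5),              # IoU score
        sm.metrics.FScore(threshold=0.5),                # F1 score
        tf.keras.metrics.BinaryAccuracy(name='accuracy'),
        tf.keras.metrics.Precision(name='precision'),
        tf.keras.metrics.Recall(name='recall'),
    ],
)

model.fit(x_train, y_train, batch_size=12, epochs=10)
```

Evaluating this same script with different BACKBONE values is one plausible way to reproduce the backbone comparison the abstract reports; how the paper computes MeanIoU across classes is not specified here.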
Date of Conference: 06-08 July 2023
Date Added to IEEE Xplore: 23 November 2023
Conference Location: Delhi, India
