RailNet: A Segmentation Network for Railroad Detection


Abstract:

Future trains will use more computer vision aids to help achieve fully autonomous driving. One of the most important parts of a train's visual function is the detection of railroad obstacles. This makes it important to identify and segment the railroad region within each video frame, as it allows the train to determine the driving area and thus perform effective obstacle detection. Traditional railroad detection methods rely on hand-crafted features or highly specialized sensors such as lidar, which are expensive to maintain and less robust to scene changes. RailNet is a deep learning segmentation algorithm for railroad detection in videos captured by front-view on-board cameras. RailNet provides an end-to-end solution that combines feature extraction and segmentation. We modify the backbone network to extract multi-convolution features and use a pyramid structure to propagate these features from top to bottom. Our model detects the railroad without generating large numbers of candidate regions, which greatly increases detection speed. Tested on the Railroad Segmentation Dataset (RSDS), which we built, RailNet exhibits very good performance while processing 20 frames per second.
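The abstract's mention of a pyramid structure with top-to-bottom feature propagation suggests an FPN-style merge of multi-scale backbone features. A minimal sketch of that idea is shown below, assuming a standard top-down pathway with lateral additions; the function names and shapes are illustrative, not taken from the paper.

```python
import numpy as np

def upsample2x(x):
    # Nearest-neighbor 2x upsampling along the two spatial axes (H, W).
    return x.repeat(2, axis=0).repeat(2, axis=1)

def top_down_pyramid(features):
    """Merge multi-scale feature maps from top (coarse) to bottom (fine).

    features: list of (H, W) arrays ordered fine -> coarse, each level
    half the spatial resolution of the previous one. Each finer level
    receives the upsampled coarser level added to its lateral feature.
    """
    merged = [features[-1]]  # start from the coarsest (top) level
    for lateral in reversed(features[:-1]):
        merged.append(lateral + upsample2x(merged[-1]))
    return list(reversed(merged))  # return in fine -> coarse order

# Toy three-level pyramid: 8x8, 4x4, 2x2 feature maps.
feats = [np.ones((8, 8)), np.ones((4, 4)), np.ones((2, 2))]
out = top_down_pyramid(feats)
```

In this sketch the finest output level accumulates information from every coarser level, which is the propagation behavior the abstract describes; a real implementation would operate on learned convolutional feature maps with channel dimensions rather than toy 2-D arrays.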
Published in: IEEE Access ( Volume: 7)
Page(s): 143772 - 143779
Date of Publication: 04 October 2019
Electronic ISSN: 2169-3536
