Lane Line Detection Based on Improved U-Net Network

aut.event.date: 2022-07-22 to 2022-07-24
aut.event.place: Zhuhai
aut.researcher: Hutcheson, Catherine
dc.contributor.author: Li, Y
dc.contributor.author: Zhang, S
dc.contributor.author: Wang, Y
dc.contributor.author: Ma, J
dc.date.accessioned: 2022-10-20T22:40:42Z
dc.date.available: 2022-10-20T22:40:42Z
dc.date.copyright: 2022-08-31
dc.date.issued: 2022-08-31
dc.description.abstract: Lane line detection and recognition are a crucial research area for autonomous driving. This work addresses the problems of fuzzy feature expression and poor timeliness in lane line detection based on semantic segmentation. This paper proposes removing irrelevant background with a dynamic-programming region of interest while improving a lightweight neural network (U-Net). Group convolution and depthwise separable convolution are introduced into the backbone network, simplifying its branches, and atrous convolution is introduced into the enhanced path network with a multi-level skip connection structure to retain low-level coarse-grained semantic feature information. The full-scale skip connection fusion mechanism of the decoder is preserved, capturing both the fine-grained and coarse-grained semantics of the feature map at full scale. Skip connections between the decoder and the encoder strengthen the network's ability to extract lane line features and context without increasing the receptive field, improving lane line detection accuracy. Experimental results show that the improved network achieves good detection performance on complex lane lines and effectively improves the accuracy and timeliness of lane line detection.
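The depthwise separable convolutions the abstract introduces into the backbone are the main source of the network's lightness: a k×k standard convolution is replaced by a per-channel k×k depthwise filter followed by a 1×1 pointwise convolution. A minimal sketch of the parameter-count arithmetic, using illustrative channel sizes that are not taken from the paper:

```python
# Parameter counts for a standard k x k convolution versus a depthwise
# separable one (depthwise k x k filter per input channel + 1x1 pointwise
# convolution mixing channels). Bias terms are omitted for simplicity.

def standard_conv_params(k: int, c_in: int, c_out: int) -> int:
    # One k x k x c_in filter per output channel.
    return k * k * c_in * c_out

def depthwise_separable_params(k: int, c_in: int, c_out: int) -> int:
    depthwise = k * k * c_in           # one k x k filter per input channel
    pointwise = 1 * 1 * c_in * c_out   # 1x1 conv combines channels
    return depthwise + pointwise

# Example layer: 3x3 kernel, 64 input channels, 128 output channels.
std = standard_conv_params(3, 64, 128)        # 73728 parameters
sep = depthwise_separable_params(3, 64, 128)  # 8768 parameters
print(std, sep, round(std / sep, 1))          # roughly 8.4x fewer parameters
```

The reduction factor is approximately 1/c_out + 1/k², which is why swapping these into the backbone shrinks the model markedly while leaving the receptive field of each layer unchanged.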
dc.identifier.citation: In Proc. SPIE 12344, International Conference on Intelligent and Human-Computer Interaction Technology (IHCIT 2022), 1234401 (7 October 2022); doi: 10.1117/12.2659811
dc.identifier.doi: 10.1117/12.2659811
dc.identifier.uri: https://hdl.handle.net/10292/15537
dc.publisher: Society of Photo-optical Instrumentation Engineers (SPIE)
dc.rights: © 2022 SPIE. SPIE grants to authors (and their employers) of papers, posters, and presentation recordings published in SPIE Proceedings or SPIE Journals on the SPIE Digital Library the right to post an author-prepared version or an official version (preferred version) of the published paper, poster, or presentation recording on an internal or external repository controlled exclusively by the author/employer, or the entity funding the research, provided that (a) such posting is noncommercial in nature and the paper, poster, or presentation recording is made available to users without charge; (b) an appropriate copyright notice and citation appear with the paper, poster, or presentation recording; and (c) a link to SPIE's official online version of the paper, poster, or presentation recording is provided using the DOI (Digital Object Identifier) link.
dc.rights.accessrights: OpenAccess
dc.subject: Driverless vehicle; Lane line recognition; Assisted driving; Image segmentation
dc.title: Lane Line Detection Based on Improved U-Net Network
pubs.elements-id: 463952
pubs.organisational-data: /AUT
pubs.organisational-data: /AUT/Faculty of Design & Creative Technologies
pubs.organisational-data: /AUT/Faculty of Design & Creative Technologies/School of Engineering, Computer & Mathematical Sciences
pubs.organisational-data: /AUT/PBRF
pubs.organisational-data: /AUT/PBRF/PBRF Design and Creative Technologies
pubs.organisational-data: /AUT/PBRF/PBRF Design and Creative Technologies/ECMS PBRF 2018

Files

Original bundle

Name: Ma_2022_Lane Line Detection based on Improved U-Net network.pdf
Size: 770.2 KB
Format: Adobe Portable Document Format
Description: Conference contribution

License bundle

Name: AUT Grant of Licence for Tuwhera Jun 2021.pdf
Size: 360.95 KB
Format: Adobe Portable Document Format
Description: