Multimodal End-to-End Learning for Autonomous Steering in Adverse Road and Weather Conditions

dc.contributor.author Maanpää, Jyri
dc.contributor.author Taher, Josef
dc.contributor.author Manninen, Petri
dc.contributor.author Pakola, Leo
dc.contributor.author Melekhov, Iaroslav
dc.contributor.author Hyyppä, Juha
dc.date.issued 2021
dc.identifier.isbn 978-1-7281-8808-9
dc.identifier.isbn 978-1-7281-8809-6
dc.identifier.issn 1051-4651
dc.description.abstract Autonomous driving is challenging in adverse road and weather conditions, in which lane lines may be absent, the road may be covered in snow, and visibility may be poor. We extend previous work on end-to-end learning for autonomous steering to operate in these adverse real-life conditions with multimodal data. We collected 28 hours of driving data in several road and weather conditions and trained convolutional neural networks to predict the car's steering wheel angle from front-facing color camera images and lidar range and reflectance data. We compared CNN model performance across the different modalities, and our results show that the lidar modality improves the performance of several multimodal sensor-fusion models. On-road tests with the different models support this observation.
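The abstract describes channel-level sensor fusion: camera images and lidar range/reflectance data feeding one CNN that regresses the steering wheel angle. A minimal early-fusion sketch of that idea, in NumPy rather than the paper's actual architecture, is below; the shapes, weights, and function names are illustrative assumptions, not the authors' model.

```python
# Sketch of early sensor fusion for steering prediction, assuming camera
# RGB (3 channels) and lidar range + reflectance (2 channels) projected
# into the same front-facing image plane. All names/shapes are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def conv2d(x, w):
    """Naive valid 2D convolution: x (C,H,W), w (F,C,kh,kw) -> (F,H',W')."""
    f, c, kh, kw = w.shape
    _, h, wd = x.shape
    out = np.zeros((f, h - kh + 1, wd - kw + 1))
    for i in range(out.shape[1]):
        for j in range(out.shape[2]):
            # Contract the (C, kh, kw) patch against every filter at once.
            out[:, i, j] = np.tensordot(w, x[:, i:i + kh, j:j + kw], axes=3)
    return out

def predict_steering(camera_rgb, lidar_range, lidar_refl, w_conv, w_out):
    """Early fusion: stack modalities channel-wise, convolve, pool, regress."""
    x = np.concatenate([camera_rgb,
                        lidar_range[None],
                        lidar_refl[None]], axis=0)   # (5, H, W) fused input
    feat = np.maximum(conv2d(x, w_conv), 0.0)        # ReLU feature maps
    pooled = feat.mean(axis=(1, 2))                  # global average pool
    return float(pooled @ w_out)                     # predicted wheel angle

# Toy inputs standing in for one synchronized camera/lidar frame.
H, W = 16, 16
cam = rng.random((3, H, W))
lidar_rng = rng.random((H, W))
lidar_refl = rng.random((H, W))
w_conv = rng.standard_normal((8, 5, 3, 3)) * 0.1
w_out = rng.standard_normal(8) * 0.1
angle = predict_steering(cam, lidar_rng, lidar_refl, w_conv, w_out)
print(angle)
```

The paper also evaluates other fusion points; a mid- or late-fusion variant would instead run separate convolutional stacks per modality and concatenate their pooled features before the regression head.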
dc.language.iso en
dc.publisher IEEE
dc.relation.ispartofseries Proceedings of ICPR 2020: 25th International Conference on Pattern Recognition, Milan, 10 – 15 January 2021
dc.subject Reflectivity
dc.subject Laser radar
dc.subject Roads
dc.subject Snow
dc.subject Wheels
dc.subject Training data
dc.subject Sensor fusion
dc.title Multimodal End-to-End Learning for Autonomous Steering in Adverse Road and Weather Conditions
dc.type Article

Files in this item

2021Maanpää46358.pdf (5.983 MB, PDF)
