Multimodal End-to-End Learning for Autonomous Steering in Adverse Road and Weather Conditions

Permalink

http://hdl.handle.net/10138/331931

Files:
2021Maanpää46358.pdf (5.983 MB, PDF)
Title: Multimodal End-to-End Learning for Autonomous Steering in Adverse Road and Weather Conditions
Author(s): Maanpää, Jyri; Taher, Josef; Manninen, Petri; Pakola, Leo; Melekhov, Iaroslav; Hyyppä, Juha
Publisher: IEEE
Date: 2021
Language: en
Belongs to series: Proceedings of ICPR 2020: 25th International Conference on Pattern Recognition, Milan, 10 – 15 January 2021
ISBN: 978-1-7281-8808-9, 978-1-7281-8809-6
ISSN: 1051-4651
URI: http://hdl.handle.net/10138/331931
Abstract: Autonomous driving is challenging in adverse road and weather conditions, in which lane lines may be missing, the road may be covered in snow, and visibility may be poor. We extend previous work on end-to-end learning for autonomous steering to operate in these adverse real-life conditions with multimodal data. We collected 28 hours of driving data in several road and weather conditions and trained convolutional neural networks to predict the car steering wheel angle from front-facing color camera images and lidar range and reflectance data. We compared the performance of the CNN models based on the different modalities, and our results show that the lidar modality improves the performance of the multimodal sensor-fusion models. We also performed on-road tests with different models, which support this observation.
Keywords: Reflectivity, Laser radar, Roads, Snow, Wheels, Training data, Sensor fusion
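
As described in the abstract above, the work trains convolutional neural networks that fuse front-facing camera images with lidar range and reflectance data to predict the car steering wheel angle. The Python (PyTorch) snippet below is only a minimal sketch of one such sensor-fusion model, not the authors' architecture: the two-branch layout, the concatenation-based fusion point, all layer sizes, the input resolutions, and the name FusionSteeringNet are assumptions made for illustration.

# Minimal sketch of a camera + lidar sensor-fusion CNN for steering angle
# regression. NOT the authors' architecture: layer sizes, input resolutions,
# and the concatenation-based fusion point are illustrative assumptions.
import torch
import torch.nn as nn


def conv_branch(in_channels: int) -> nn.Sequential:
    """Small convolutional feature extractor, one per input modality."""
    return nn.Sequential(
        nn.Conv2d(in_channels, 24, kernel_size=5, stride=2), nn.ReLU(),
        nn.Conv2d(24, 36, kernel_size=5, stride=2), nn.ReLU(),
        nn.Conv2d(36, 48, kernel_size=3, stride=2), nn.ReLU(),
        nn.AdaptiveAvgPool2d((4, 8)),  # fixed-size features regardless of input resolution
        nn.Flatten(),
    )


class FusionSteeringNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.camera_branch = conv_branch(in_channels=3)  # RGB camera image
        self.lidar_branch = conv_branch(in_channels=2)   # lidar range + reflectance "image"
        self.head = nn.Sequential(
            nn.Linear(2 * 48 * 4 * 8, 100), nn.ReLU(),
            nn.Linear(100, 50), nn.ReLU(),
            nn.Linear(50, 1),  # predicted steering wheel angle (scalar)
        )

    def forward(self, camera: torch.Tensor, lidar: torch.Tensor) -> torch.Tensor:
        # Late fusion: concatenate per-modality features, then regress the angle.
        fused = torch.cat([self.camera_branch(camera), self.lidar_branch(lidar)], dim=1)
        return self.head(fused)


if __name__ == "__main__":
    model = FusionSteeringNet()
    camera = torch.randn(1, 3, 160, 320)  # one RGB frame (assumed resolution)
    lidar = torch.randn(1, 2, 160, 320)   # range + reflectance projected to the same grid
    angle = model(camera, lidar)
    print(angle.shape)  # torch.Size([1, 1])

For training such a regression model, a mean-squared-error loss between the predicted and recorded steering wheel angles would be a natural objective, as is common in end-to-end steering work; the paper itself should be consulted for the actual architecture, loss, and fusion strategy used.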

