Show simple item record

dc.contributor.advisor  Pettersen, Kristin Ytterstad
dc.contributor.author  Faxvaag, Erlend
dc.date.accessioned  2018-09-05T14:02:14Z
dc.date.available  2018-09-05T14:02:14Z
dc.date.created  2018-06-11
dc.date.issued  2018
dc.identifier  ntnudaim:18593
dc.identifier.uri  http://hdl.handle.net/11250/2561057
dc.description.abstract  Autonomous vehicles have numerous advantages over conventional vehicles: they can reduce fuel consumption, injuries, and deaths, optimize mobility, and ease traffic congestion. Most lane-assist systems in consumer cars today are composed of several modules and can be complex and non-general, and few of them can handle dirt roads. Changes in weather and road type can drastically degrade the performance and safety of a lane assist. The end-to-end approach to autonomous driving has shown promising results in recent years. It uses a front-facing camera on a vehicle that feeds images of the road through a CNN, which maps each image directly to a steering angle. End-to-end networks have the advantage that no rules need to be designed manually. The network is trained with supervised learning by cloning the behavior of human drivers, and it learns to generalize its perception of the road so it can predict accurately regardless of weather and road conditions. This thesis proposes a multi-input end-to-end network for dirt roads that combines camera images and LiDAR data in an attempt to outperform single-input end-to-end networks. On a held-out test set, the experiments showed that combining camera images and LiDAR outperforms camera or LiDAR alone. Multi-input networks can therefore improve the local navigation of an off-road autonomous UGV. Additionally, a path-verification technique is presented: a segmentation network segments out the dirt road, and together with the predicted steering angle, this is used to evaluate the vehicle's local path.
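The multi-input fusion idea described in the abstract can be illustrated with a minimal sketch: each modality is passed through its own feature extractor, the resulting features are concatenated, and a regression head maps the fused vector to a single steering angle. This is an assumption-laden toy, not the thesis's actual architecture; it substitutes tiny dense layers with random weights for the convolutional branches, and all layer sizes and input shapes are made up for illustration.

```python
import numpy as np

# Illustrative sketch only: toy linear "encoders" stand in for the
# convolutional camera and LiDAR branches. All shapes, weights, and
# layer sizes here are invented assumptions, not the thesis's design.

rng = np.random.default_rng(0)

def encoder(x, w, b):
    """Toy feature extractor: one dense layer with ReLU."""
    return np.maximum(0.0, x @ w + b)

# Fake inputs standing in for a flattened camera image and LiDAR scan.
camera = rng.standard_normal(64)
lidar = rng.standard_normal(32)

# Separate branch weights, randomly initialised for the sketch.
w_cam, b_cam = rng.standard_normal((64, 16)), np.zeros(16)
w_lid, b_lid = rng.standard_normal((32, 16)), np.zeros(16)

# Each modality is encoded independently, then fused by concatenation.
fused = np.concatenate([encoder(camera, w_cam, b_cam),
                        encoder(lidar, w_lid, b_lid)])

# A regression head maps the fused features to one steering angle,
# squashed with tanh so the output stays in [-1, 1].
w_head = rng.standard_normal(32)
steering_angle = float(np.tanh(fused @ w_head))
print(steering_angle)
```

In a trained network the branch weights would come from supervised learning on human driving data, as the abstract describes; concatenation is only one possible fusion strategy.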
dc.language  eng
dc.publisher  NTNU
dc.subject  Kybernetikk og robotikk (2-årig), Tilpassede datasystemer [Cybernetics and Robotics (2-year), Customized Computer Systems]
dc.title  Deep Convolutional Networks for Steering an Off-Road Unmanned Ground Vehicle - End-To-End Learning and Sensor Fusion
dc.type  Master thesis


