
dc.contributor.advisor    Gravdahl, Jan Tommy
dc.contributor.author     Gundersen, Vetle Bjørngaard
dc.date.accessioned       2018-08-17T14:01:57Z
dc.date.available         2018-08-17T14:01:57Z
dc.date.created           2018-06-09
dc.date.issued            2018
dc.identifier             ntnudaim:18643
dc.identifier.uri         http://hdl.handle.net/11250/2558432
dc.description.abstract   This Master's thesis proposes a novel implementation of an autonomous tracker in Python, which combines a deep learning detection module and a point-based tracking module. An accurate detector introduces latency when the video capture rate exceeds its processing rate. A frame buffer, a key element of the combined design, compensates for this weakness: all frames skipped by the detector are stored, and a fast tracker processes the buffer to provide an updated object prediction for the current frame. The system is developed with a focus on future deployment on an NVIDIA Jetson TX2 embedded platform, and utilizes Google's TensorFlow Object Detection API and the OpenCV object tracking API. The autonomous tracker is evaluated on a number of relevant videos using a hybrid measure that combines bounding box overlap with a newly proposed distance error score. The final system configuration, with a lightweight neural network for detection and the median flow algorithm for tracking, shows real-time performance on a quad-core CPU. (A minimal sketch of the buffer-and-replay design follows the record below.)
dc.language               eng
dc.publisher              NTNU
dc.subject                Kybernetikk og robotikk
dc.title                  Autonomous Target Detection and Tracking for Remotely Operated Weapon Stations
dc.type                   Master thesis
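
The buffer-and-replay combination described in the abstract can be illustrated with a short Python sketch. This is not the thesis code: the detection call is a placeholder, the fixed detector_latency stands in for the real (asynchronous) detection time, and the MedianFlow constructor is guarded because its name differs between OpenCV versions.

```python
from collections import deque

import cv2


def create_medianflow():
    # The classic trackers moved to cv2.legacy in OpenCV >= 4.5;
    # fall back to the older constructor name for 3.x/early 4.x builds.
    if hasattr(cv2, "legacy"):
        return cv2.legacy.TrackerMedianFlow_create()
    return cv2.TrackerMedianFlow_create()


def run_detector(frame):
    """Placeholder for the slow TensorFlow detection step.
    Should return one bounding box as (x, y, w, h)."""
    raise NotImplementedError


def track(capture, detector_latency=5):
    """Yield (frame_index, box) for every captured frame.

    Detection of a frame is assumed to finish only after `detector_latency`
    further frames have been captured; those skipped frames are buffered and
    replayed through a fast MedianFlow tracker, so the reported box always
    refers to the newest frame.
    """
    buffer = deque()
    detect_frame = None       # frame currently "inside" the detector
    tracker, box = None, None
    idx = 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        if detect_frame is None:
            detect_frame = frame                  # hand this frame to the detector
            if tracker is not None:
                _, box = tracker.update(frame)    # cheap per-frame update
        else:
            buffer.append(frame)                  # frame skipped by the detector
            if len(buffer) < detector_latency:
                if tracker is not None:
                    _, box = tracker.update(frame)
            else:
                # Detection result "arrives": re-anchor the tracker on the
                # detection frame, then replay the buffer to catch up to now.
                box = run_detector(detect_frame)
                tracker = create_medianflow()
                tracker.init(detect_frame, box)
                while buffer:
                    _, box = tracker.update(buffer.popleft())
                detect_frame = None
        yield idx, box
        idx += 1
```

As a usage example, track(cv2.VideoCapture(0)) would iterate over webcam frames; in the thesis the detection step is a TensorFlow model and its latency is determined by the network and hardware rather than a fixed frame count.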

