
dc.contributor.author: Leira, Frederik Stendahl
dc.contributor.author: Helgesen, Håkon Hagen
dc.contributor.author: Johansen, Tor Arne
dc.contributor.author: Fossen, Thor I.
dc.date.accessioned: 2021-02-09T14:02:54Z
dc.date.available: 2021-02-09T14:02:54Z
dc.date.created: 2020-12-21T12:43:23Z
dc.date.issued: 2020
dc.identifier.issn: 1556-4959
dc.identifier.uri: https://hdl.handle.net/11250/2726950
dc.description.abstract: In this paper, a multiple-object detection, recognition, and tracking system for unmanned aerial vehicles (UAVs) is studied. The system can be implemented on any UAV platform, the main requirement being that the UAV carries a suitable onboard computational unit and a camera. It is intended for a maritime object tracking framework that enables a UAV to perform multiobject tracking and situational awareness of the sea surface in real time during an operation. Using machine vision to automatically detect objects in the camera's image stream, combined with the UAV's navigation data, the onboard computer georeferences each detection to measure the location of the detected objects in a local North-East (NE) coordinate frame. A tracking algorithm based on a Kalman filter with a constant-velocity motion model uses these position measurements to track and estimate each object's position and velocity. Furthermore, a global-nearest-neighbor algorithm is applied for data association, using a distance measure based not only on the physical distance between an object's estimated and measured positions but also on how similar the objects appear in the camera image. Four field tests were conducted at sea to verify the object detection and tracking system. One flight test was a two-object tracking scenario, which is also used in three scenarios with two additional simulated objects. The tracking results demonstrate the effectiveness of using visual recognition for data association to avoid interchanging the two estimated object trajectories. Furthermore, real-time computations on the gathered data show that the system can automatically detect and track the position and velocity of a boat.
Given at least 100 georeferenced measurements of the boat's position, the position was estimated and tracked with an accuracy of 5–15 m from a 400 m altitude while the boat was in the camera's field of view (FOV). The estimated speed and course also converged to the object's true trajectories (measured by the Global Positioning System, GPS) in the tested scenarios. This enables the system to track boats while they are outside the camera's FOV for extended periods, with tracking results showing a drift in the boat's position estimate down to 1–5 m/min outside the FOV.
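The tracking pipeline described in the abstract — a constant-velocity Kalman filter on georeferenced NE positions, plus data association that mixes physical distance with appearance similarity — can be sketched as below. This is a minimal illustration, not the paper's implementation: the matrices, noise levels, the `alpha` weighting, the `gate` threshold, and the `appearance_dist` score are all assumed placeholders, and the greedy assignment is a simplification of true global-nearest-neighbor association (which would solve the assignment globally, e.g. with the Hungarian algorithm).

```python
import numpy as np


class CVKalmanTracker:
    """Constant-velocity Kalman filter for one object's NE position.

    State x = [n, e, vn, ve]; measurements z = (n, e) are georeferenced
    detections. All noise levels here are illustrative placeholders.
    """

    def __init__(self, n, e, dt=1.0, q=0.5, r=5.0):
        self.x = np.array([n, e, 0.0, 0.0])          # initial state
        self.P = np.diag([r**2, r**2, 10.0, 10.0])   # initial covariance
        self.F = np.array([[1.0, 0.0, dt, 0.0],      # CV state transition
                           [0.0, 1.0, 0.0, dt],
                           [0.0, 0.0, 1.0, 0.0],
                           [0.0, 0.0, 0.0, 1.0]])
        self.H = np.array([[1.0, 0.0, 0.0, 0.0],     # position observed only
                           [0.0, 1.0, 0.0, 0.0]])
        self.Q = q * np.eye(4)                       # process noise
        self.R = (r**2) * np.eye(2)                  # measurement noise

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]                            # predicted NE position

    def update(self, z):
        y = np.asarray(z, dtype=float) - self.H @ self.x   # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)           # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P


def gnn_associate(tracks, detections, alpha=0.5, gate=50.0):
    """Greedy nearest-neighbor association (simplified stand-in for GNN).

    Each detection is (n, e, appearance_dist), where appearance_dist in
    [0, 1] is a hypothetical visual-dissimilarity score; alpha weights
    appearance against physical distance. Returns {track_idx: det_idx}.
    """
    pairs = []
    for ti, trk in enumerate(tracks):
        tn, te = trk.x[0], trk.x[1]
        for di, (n, e, appearance_dist) in enumerate(detections):
            d = float(np.hypot(n - tn, e - te))
            if d < gate:                             # gate out far detections
                pairs.append((d + alpha * gate * appearance_dist, ti, di))
    pairs.sort()
    assigned, used_t, used_d = {}, set(), set()
    for _, ti, di in pairs:                          # cheapest pairs first
        if ti not in used_t and di not in used_d:
            assigned[ti] = di
            used_t.add(ti)
            used_d.add(di)
    return assigned
```

The appearance term in the cost is what the abstract credits for keeping two nearby trajectories from being interchanged: when two boats pass close to each other, physical distance alone becomes ambiguous, but a detection that looks unlike a track's object is penalized even if it is spatially closest.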
dc.language.iso: eng
dc.publisher: Wiley
dc.rights: Attribution-NonCommercial 4.0 International
dc.rights.uri: http://creativecommons.org/licenses/by-nc/4.0/deed.no
dc.title: Object Detection, Recognition and Tracking from UAVs using a Thermal Camera
dc.type: Peer reviewed
dc.type: Journal article
dc.description.version: publishedVersion
dc.source.journal: Journal of Field Robotics (JFR)
dc.identifier.doi: 10.1002/rob.21985
dc.identifier.cristin: 1862341
dc.relation.project: Norges forskningsråd: 223254
dc.description.localcode: © 2020 The Authors. Journal of Field Robotics published by Wiley Periodicals LLC. This is an open access article under the terms of the Creative Commons Attribution-NonCommercial License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited and is not used for commercial purposes.
cristin.ispublished: true
cristin.fulltext: postprint
cristin.qualitycode: 1

