
dc.contributor.author: Bosdelekidis, Vasileios
dc.contributor.author: Johansen, Tor Arne
dc.contributor.author: Sokolova, Nadezda
dc.contributor.author: Bryne, Torleiv Håland
dc.date.accessioned: 2024-02-14T11:49:27Z
dc.date.available: 2024-02-14T11:49:27Z
dc.date.created: 2023-06-01T15:26:53Z
dc.date.issued: 2023
dc.identifier.citation: IEEE - ION Position Location and Navigation Symposium. 2023, (D4a: AI-Enhanced Navigation Systems), 81-92.
dc.identifier.issn: 2153-358X
dc.identifier.uri: https://hdl.handle.net/11250/3117520
dc.description.abstract: This article introduces a method that applies deep learning within a typical Multiple Hypothesis Solution Separation (MHSS)-based Integrity Monitor (IM) for autonomous vehicle navigation systems in which conventional sensors, such as GNSS and an Inertial Measurement Unit (IMU), are integrated with a camera. The methodology reduces the hypothesis space of sensor faults: the measurement subsets evaluated by MHSS are generated from the IMU/GNSS measurement set only, so that Fault Detection and Exclusion (FDE) for the camera measurements takes place separately. In the investigated approach, anomalies in the state estimate error caused by camera faults are predicted from raw images with a Deep Neural Network (DNN), and IM input is required only during the online refinement of the predicted anomaly locations, so that the anomalies are reflected in the IM test statistic. This opens up the possibility of evaluating the environment features and conditions that cause specific detected or undetected sensor faults. Experiments on the IMU/GNSS/camera integration demonstrated that the Protection Level (PL) bounding performance of the proposed IM, with its limited hypothesis space and individual camera FDE, is comparable to that of an MHSS IM informed by the full set of fault hypotheses. Although the camera is the demonstrated use case, the method extends directly to integrations with multiple auxiliary sensors, where each auxiliary sensor is evaluated individually for faults.
dc.language.iso: eng
dc.publisher: IEEE
dc.rights: Attribution 4.0 International (Navngivelse 4.0 Internasjonal)
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/deed.no
dc.subject: Navigasjon
dc.subject: Navigation
dc.title: Solution Separation-Based Integrity Monitor for Integrated GNSS/IMU/Camera Navigation: Constraining the Hypothesis Space With Deep Learning
dc.title.alternative: Solution Separation-Based Integrity Monitor for Integrated GNSS/IMU/Camera Navigation: Constraining the Hypothesis Space With Deep Learning
dc.type: Peer reviewed
dc.type: Journal article
dc.description.version: acceptedVersion
dc.subject.nsi: VDP::Informasjons- og kommunikasjonsteknologi: 550
dc.subject.nsi: VDP::Information and communication technology: 550
dc.source.pagenumber: 81-92
dc.source.journal: IEEE - ION Position Location and Navigation Symposium
dc.source.issue: D4a: AI-Enhanced Navigation Systems
dc.identifier.doi: 10.1109/PLANS53410.2023.10140047
dc.identifier.cristin: 2150934
dc.relation.project: Norges forskningsråd: 305051
cristin.ispublished: true
cristin.fulltext: postprint
cristin.qualitycode: 1
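
The abstract above describes a Multiple Hypothesis Solution Separation (MHSS) integrity monitor whose fault-hypothesis subsets are built from the IMU/GNSS measurements only. The sketch below illustrates only the generic solution-separation principle on a toy linear estimation problem: each subset solution is compared against the all-in-view solution, a separation exceeding its threshold flags a fault, and a simplified protection level is taken as the worst case over the hypotheses. It is a minimal illustration under stated assumptions, not the authors' implementation; the measurement model, noise levels, integrity-risk quantiles, and the simplified bound PL = max_k (T_k + K_md * sigma_k) are assumptions made for this example.

    # Minimal sketch of the solution-separation principle behind MHSS integrity
    # monitoring, on a toy linear measurement model. NOT the paper's method:
    # geometry, noise levels, and the simplified protection-level bound are
    # illustrative assumptions only.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)

    # Toy model: estimate a scalar state from n redundant measurements y = H x + v.
    n = 6
    H = np.ones((n, 1))                      # observation matrix (all sensors observe the state)
    sigma = np.full(n, 0.5)                  # per-measurement noise standard deviation
    x_true = 10.0
    y = H @ np.array([x_true]) + rng.normal(0.0, sigma)
    y[2] += 3.0                              # inject a fault on measurement 2

    def wls(H, y, sigma):
        """Weighted least-squares estimate and its variance."""
        W = np.diag(1.0 / sigma**2)
        P = np.linalg.inv(H.T @ W @ H)       # estimate covariance
        x_hat = (P @ H.T @ W @ y).item()
        return x_hat, P.item()

    # All-in-view solution (null hypothesis: no faults).
    x0, var0 = wls(H, y, sigma)

    # Assumed false-alarm / missed-detection quantiles for the example.
    K_fa, K_md = norm.ppf(1 - 1e-5), norm.ppf(1 - 1e-7)

    pl_candidates, flagged = [], []
    for k in range(n):                       # one single-fault hypothesis per measurement
        keep = np.arange(n) != k
        xk, vark = wls(H[keep], y[keep], sigma[keep])
        d_k = abs(x0 - xk)                   # solution-separation test statistic
        sigma_dk = np.sqrt(max(vark - var0, 0.0))
        T_k = K_fa * sigma_dk                # detection threshold
        if d_k > T_k:
            flagged.append(k)
        pl_candidates.append(T_k + K_md * np.sqrt(vark))   # simplified per-hypothesis PL

    print("flagged measurements:", flagged)
    print("protection level:", max(pl_candidates))

In the paper's setting the subsets would be drawn from the IMU/GNSS measurements only, with the camera handled by its own DNN-based FDE stage before the solution-separation test is applied.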



Attribution 4.0 International
Except where otherwise noted, this item is licensed as Attribution 4.0 International