Show simple item record

dc.contributor.author: Zwilgmeyer, Peder Georg Olofsson
dc.contributor.author: Yip, Mauhing
dc.contributor.author: Teigen, Andreas Langeland
dc.contributor.author: Mester, Rudolf
dc.contributor.author: Stahl, Annette
dc.date.accessioned: 2022-04-20T11:24:58Z
dc.date.available: 2022-04-20T11:24:58Z
dc.date.created: 2021-09-03T17:02:37Z
dc.date.issued: 2021
dc.identifier.citation: IEEE International Conference on Computer Vision (ICCV). 2021, 3715-3723. [en_US]
dc.identifier.issn: 1550-5499
dc.identifier.uri: https://hdl.handle.net/11250/2991611
dc.description.abstract: Underwater visual perception requires being able to deal with bad and rapidly varying illumination and with reduced visibility due to water turbidity. The verification of such algorithms is crucial for safe and efficient underwater exploration and intervention operations. Ground truth data play an important role in evaluating vision algorithms. However, obtaining ground truth from real underwater environments is in general very hard, if possible at all. In a synthetic underwater 3D environment, however, (nearly) all parameters are known and controllable, and ground truth data can be absolutely accurate in terms of geometry. In this paper, we present the VAROS environment, our approach to generating highly realistic underwater video and auxiliary sensor data with precise ground truth, built around the Blender modeling and rendering environment. VAROS allows for physically realistic motion of the simulated underwater (UW) vehicle, including moving illumination. Pose sequences are created by first defining waypoints for the simulated underwater vehicle, which are expanded into a smooth vehicle course sampled at the IMU data rate (200 Hz). This expansion uses a vehicle dynamics model and a discrete-time controller algorithm that simulates the sequential following of the waypoints. The scenes are rendered using the raytracing method, which generates realistic images integrating direct light and indirect volumetric scattering. The VAROS dataset version 1 provides images, inertial measurement unit (IMU) and depth gauge data, as well as ground truth poses, depth images and surface normal images. [en_US]
dc.language.iso: eng [en_US]
dc.publisher: IEEE [en_US]
dc.title: The VAROS Synthetic Underwater Data Set: Towards realistic multi-sensor underwater data with ground truth [en_US]
dc.type: Peer reviewed [en_US]
dc.type: Journal article [en_US]
dc.description.version: acceptedVersion [en_US]
dc.rights.holder: © IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. [en_US]
dc.source.pagenumber: 3715-3723 [en_US]
dc.source.journal: IEEE International Conference on Computer Vision (ICCV) [en_US]
dc.identifier.doi: 10.1109/ICCVW54120.2021.00415
dc.identifier.cristin: 1931283
dc.relation.project: Norges forskningsråd: 223254 [en_US]
dc.relation.project: Norges forskningsråd: 304667 [en_US]
cristin.ispublished: true
cristin.fulltext: postprint
cristin.qualitycode: 1

