Show simple item record

dc.contributor.advisor: Schjølberg, Ingrid
dc.contributor.advisor: Utne, Ingrid Bouwer
dc.contributor.advisor: Haugaløkken, Bent O. A.
dc.contributor.author: Skaldebø, Martin Breivik
dc.date.accessioned: 2023-02-02T07:05:45Z
dc.date.available: 2023-02-02T07:05:45Z
dc.date.issued: 2023
dc.identifier.isbn: 978-82-326-5480-2
dc.identifier.issn: 2703-8084
dc.identifier.uri: https://hdl.handle.net/11250/3047839
dc.description.abstract: This thesis considers intelligent solutions that facilitate autonomous technology in underwater intervention and navigation. A special focus has been on implementing methods and solutions for inspection, maintenance, and repair (IMR) operations using low-cost equipment. The presented work involves the development and implementation of solutions to increase the efficiency and safety of operations, and includes both theoretical contributions and experimental testing. The work covers learning algorithms that improve the visual authenticity of simulators and digital-twin scenarios, computer vision for guidance and navigation of underwater vehicles and intervention systems, development and testing of novel equipment, and experimental verification of the presented methods and equipment. The introduction of advanced learning algorithms enables systems to perform tasks that were previously too complex for any modelled solution. This thesis explores the use of generative adversarial networks to improve the realism of simulated environments, which in turn improves the transfer of learning between simulated environments and real-world operations. Such a mapping between domains is difficult to model, especially in the underwater scene, given the intricate scenery with scattering of light and marine particles. Machine learning algorithms provide new solutions for this mapping and can help results from simulation tools have a greater impact on real-world operations. Just as humans use their senses to experience life, autonomous systems require sensory feedback to act and react upon. A sensor is only as effective as the information that can be extracted from its output, and strengthening this information improves the support the sensor provides.
Camera footage contains information with higher spatial and temporal resolution than acoustic data, and utilizing this information to its full extent will improve today's sensory systems. This thesis explores the use of visual aids and the potential they bring to increase the autonomous capabilities of underwater vehicles and intervention systems. The explored methods include object detection and tracking networks, trained on custom datasets, that are able to locate objects within the camera frame. Robust localization of objects is vital in intervention operations to perform autonomous grasping efficiently and safely. The area of focus for this thesis has been the research and development of low-cost solutions for intervention using machine learning. The platform for testing has been the SeaArm-2 manipulator, a small electric modular manipulator with an integrated end-effector camera. Part of the thesis work has included the development and assembly of the manipulator, which is designed to enable both remote control and autonomous operation, where the camera allows visual aid to serve as the main sensor input in autonomous grasping operations. Object detectors and trackers, in combination with mathematical 3D feature extractors, provide a robust system capable of locating the relative 3D position of objects, or features, of interest. A large part of this thesis is dedicated to experimental verification of methods using deep learning and visual aid, implemented on the SeaArm-2 manipulator and an underwater vehicle for experimental testing of autonomous functionalities. Experimental testing is important to verify the proposed methods and solutions, and in combination with improved simulation results and the development of digital twins, this thesis provides a good platform for advancing towards a more autonomous tomorrow.
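The abstract describes combining object detection with a vision-based 3D feature extractor to recover the relative position of a target from an end-effector camera. As a minimal illustrative sketch, not the thesis's actual estimator, the core geometry can be shown with a pinhole-camera model, assuming a known real-world object width and a calibrated focal length; all names and parameter values below are hypothetical.

```python
# Hypothetical sketch of monocular distance and bearing estimation from a
# detector's bounding box, assuming an ideal pinhole camera with a known
# focal length (in pixels) and a target of known physical width.
# These constants and function names are illustrative, not from the thesis.

FOCAL_LENGTH_PX = 800.0   # assumed focal length, pixels
PRINCIPAL_X_PX = 640.0    # assumed principal-point column, pixels

def estimate_distance(object_width_m: float, bbox_width_px: float) -> float:
    """Camera-to-object distance Z in metres.

    Pinhole projection gives bbox_width_px = f * object_width_m / Z,
    hence Z = f * object_width_m / bbox_width_px.
    """
    if bbox_width_px <= 0:
        raise ValueError("bounding box width must be positive")
    return FOCAL_LENGTH_PX * object_width_m / bbox_width_px

def lateral_offset(u_px: float, distance_m: float) -> float:
    """Lateral offset X (metres) of the bounding-box centre column u_px,
    from the pinhole relation X / Z = (u - cx) / f."""
    return (u_px - PRINCIPAL_X_PX) / FOCAL_LENGTH_PX * distance_m
```

With these assumed intrinsics, a 0.10 m wide target appearing 80 px wide would be estimated at 1.0 m range; in a real system the intrinsics would come from camera calibration, and underwater refraction would need to be accounted for.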
dc.language.iso: eng
dc.publisher: NTNU
dc.relation.ispartofseries: Doctoral theses at NTNU;2023:27
dc.relation.haspart: Paper 1: Skaldebø, Martin Breivik; Sans-Muntadas, Albert; Schjølberg, Ingrid. Transfer Learning in Underwater Operations. In: Proceedings of OCEANS 2019 - Marseille. IEEE 2019. https://doi.org/10.1109/OCEANSE.2019.8867288
dc.relation.haspart: Paper 2: Skaldebø, Martin Breivik; Haugaløkken, Bent Oddvar Arnesen; Schjølberg, Ingrid. Dynamic Positioning of an Underwater Vehicle using Monocular Vision-Based Object Detection with Machine Learning. In: Proceedings of OCEANS 2019 MTS/IEEE SEATTLE. IEEE 2019. https://doi.org/10.23919/OCEANS40490.2019.8962583
dc.relation.haspart: Paper 3: Haugaløkken, Bent Oddvar Arnesen; Skaldebø, Martin Breivik; Schjølberg, Ingrid. Monocular vision-based gripping of objects. Robotics and Autonomous Systems 2020; Volume 131. https://doi.org/10.1016/j.robot.2020.103589 This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
dc.relation.haspart: Paper 4: Skaldebø, Martin Breivik; Haugaløkken, Bent Oddvar Arnesen; Schjølberg, Ingrid. SeaArm-2 - Fully electric underwater manipulator with integrated end-effector camera. In: 2021 European Control Conference, ECC 2021. IEEE conference proceedings 2021, pp. 236-242. https://doi.org/10.23919/ECC54610.2021.9655121
dc.relation.haspart: Paper 5: Sans-Muntadas, Albert; Skaldebø, Martin Breivik; Nielsen, Mikkel Cornelius; Schjølberg, Ingrid. Unsupervised Domain Transfer for Task Automation in Unmanned Underwater Vehicle Intervention Operations. IEEE Journal of Oceanic Engineering 2022; Volume 47(2), pp. 312-321. https://doi.org/10.1109/JOE.2021.3126016
dc.relation.haspart: Paper 6: Skaldebø, Martin Breivik; Haugaløkken, Bent Oddvar Arnesen; Schjølberg, Ingrid. Autonomous underwater grasping using a novel vision-based distance estimator
dc.relation.haspart: Paper 7: Skaldebø, Martin Breivik; Schjølberg, Ingrid; Haugaløkken, Bent Oddvar Arnesen. Underwater Vehicle Manipulator System (UVMS) With BlueROV2 and SeaArm-2 Manipulator. In: ASME 2022 41st International Conference on Ocean, Offshore and Arctic Engineering, Volume 5B: Ocean Engineering; Honoring Symposium for Professor Günther F. Clauss on Hydrodynamics and Ocean Engineering. Paper No: OMAE2022-79913, V05BT06A022, pp. 1-8. https://doi.org/10.1115/OMAE2022-79913
dc.relation.haspart: Paper 8: Skaldebø, Martin Breivik; Schjølberg, Ingrid. Dynamic Bayesian Networks for Reduced Uncertainty in Underwater Operations. IFAC-PapersOnLine 2022; Volume 55, pp. 409-414. https://doi.org/10.1016/j.ifacol.2022.10.462 This is an open access article under the CC BY-NC-ND license.
dc.relation.haspart: Paper 9: Transeth, Aksel Andreas; Schjølberg, Ingrid; Lekkas, Anastasios M.; Risholm, Petter; Mohammed, Ahmed Kedir; Skaldebø, Martin Breivik; Haugaløkken, Bent Oddvar Arnesen; Bjerkeng, Magnus Christian; Tsiourva, Maria Efstathia; Py, Frédéric. Autonomous subsea intervention (SEAVENTION). 14th IFAC Conference on Control Applications in Marine Systems, Robotics and Vehicles (CAMS 2022). https://doi.org/10.1016/j.ifacol.2022.10.459 This is an open access article under the CC BY-NC-ND license.
dc.relation.haspart: Paper 10: Skaldebø, Martin Breivik; Schjølberg, Ingrid; Haugaløkken, Bent Oddvar Arnesen. System integration of underwater vehicle manipulator system (UVMS) for autonomous grasping
dc.title: Intelligent low-cost solutions for underwater intervention using computer vision and machine learning
dc.type: Doctoral thesis
dc.subject.nsi: VDP::Technology: 500::Marine technology: 580

