Towards autonomous underwater navigation and perception for end-to-end ship hull inspection
Doctoral thesis
Permanent link
https://hdl.handle.net/11250/3121979
Publication date
2024
Metadata
Show full item record
Collections
- Department of Marine Technology [3397]
Abstract
This thesis presents an innovative and integrated solution for end-to-end underwater ship hull inspection using a small, low-cost Remotely Operated Vehicle (ROV). Maritime activities have significant and distinctive impacts across many sectors, with over 5000 ships operating daily, which raises safety concerns about the structural integrity of their hulls. These concerns relate not only to the safety of the ship's crew when the hull is visibly damaged, but also to the environment: an unmaintained hull causes a significant rise in fuel consumption over time. Although effective, traditional inspection methods in dry docks are slow; faster inspections would reduce the ship's downtime and cost. Remote inspections are a promising way to address this issue, as ROV-based inspections enable efficient visual documentation while the ship is still in the water and docked. Further automating the process increases efficiency, since it brings consistency and faster data processing. With the inspection culture and regulations in mind, this thesis aims to achieve a fully automated inspection of underwater ship hulls, i.e., from the deployment of the vehicle to assisting the surveyor in generating the inspection reports. To achieve this, the vehicle is equipped with a set of navigation and perception sensors to ensure hull-relative navigation and guarantee full visual coverage. Maneuvering-based guidance is employed to navigate along the hull. The hull's orientation and distance relative to the ROV are computed using a forward-looking sonar and imposed as constraints on the guidance mechanism, ensuring the vehicle faces the hull at a constant distance. Additionally, the sonar enables online acoustic mapping of the hull, tracking of the inspection progress, and visual representations of the hull. This relies on a flat-surface assumption when operating at close range.
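The constraint described above, keeping the vehicle facing the hull at a constant standoff distance, can be sketched as a simple proportional correction on the sonar-derived distance and bearing errors. This is a minimal illustrative sketch, not the guidance law developed in the thesis; the gains and the reference distance `d_ref` are hypothetical values chosen for the example.

```python
def hull_following_commands(distance, bearing_rad,
                            d_ref=1.5, k_dist=0.8, k_yaw=1.2):
    """Illustrative hull-relative station-keeping corrections.

    distance    -- standoff distance to the hull estimated from the sonar [m]
    bearing_rad -- direction of the hull relative to the ROV heading [rad]
    d_ref       -- desired standoff distance [m] (assumed value)

    Returns (surge_cmd, yaw_rate_cmd): proportional corrections that drive
    the distance error and the relative bearing to zero, so the vehicle
    keeps facing the hull at a constant range while moving along it.
    """
    surge_cmd = k_dist * (distance - d_ref)  # too far -> move closer
    yaw_rate_cmd = -k_yaw * bearing_rad      # rotate to face the hull
    return surge_cmd, yaw_rate_cmd
```

In practice such corrections would enter the guidance mechanism as constraints alongside the along-hull path-following objective, rather than acting as a standalone controller.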
When inspecting particular points of interest such as propellers, keels, and gratings, acoustic and optical data are combined to provide a better understanding of the structure through accurate 3D modelling of the scene and improved localisation of the vehicle. The acoustic-visual combination occurs at the feature level, based on the relative distances of the detected points to the perception sensors. To constrain the search space of features that can be matched, the intersection area between the sonar's acoustic beams and the camera image plane is dynamically estimated. The acoustic-visual combination is activated when specific areas of interest are detected. This detection uses deep learning models trained on a tailor-made dataset for image classification and semantic segmentation of ship parts and faults. Verified by domain experts, this dataset was made to match the needs of surveyors and is the first of its kind publicly available for ship hull inspection systems. To further assist the surveyor, sequences of the data collected by the vehicle during the mission are automatically marked based on their relevance to the inspection. These data markers are linked with visual data, models, and the ROV telemetry to provide insights and to comply with the guidelines of the international regulations. The complete solution was tested in ten harbors and on six ships of different sizes and structures to ensure the adaptability of the methods and the consistency of the results. By taking advantage of the available sensors, it was possible to move along the hull with high precision while mapping it. The proposed methods outperformed related existing ones and revealed promising opportunities for future research. Finally, the adaptability of the proposed solution made it possible to apply it to the inspection of structures other than ship hulls, including aquaculture fish net pens and subsea structures.
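The idea of constraining the feature-matching search space to the sonar-camera overlap can be illustrated by projecting sonar detections into the camera image and keeping only those that land inside it. This is a simplified sketch, not the thesis method: the extrinsic calibration (`R_cs`, `t_cs`), the intrinsic matrix `K`, and the zero-elevation (flat-target) placement of the sonar returns are all assumptions made for the example.

```python
import numpy as np

def sonar_points_in_camera_fov(ranges, bearings, R_cs, t_cs, K, img_w, img_h):
    """Keep only sonar detections that project inside the camera image,
    restricting the set of candidate acoustic-visual feature matches.

    ranges, bearings -- polar sonar detections in the sonar frame
    R_cs, t_cs       -- assumed rotation/translation from sonar to camera frame
    K                -- assumed 3x3 camera intrinsic matrix

    Returns a list of (index, u, v) pixel coordinates of candidates.
    """
    candidates = []
    for i, (r, b) in enumerate(zip(ranges, bearings)):
        # Place the return in the sonar's zero-elevation plane
        # (flat-target assumption, for illustration only).
        p_s = np.array([r * np.cos(b), r * np.sin(b), 0.0])
        p_c = R_cs @ p_s + t_cs              # express in the camera frame
        if p_c[2] <= 0:                      # behind the camera plane
            continue
        uvw = K @ p_c                        # pinhole projection
        u, v = uvw[0] / uvw[2], uvw[1] / uvw[2]
        if 0 <= u < img_w and 0 <= v < img_h:
            candidates.append((i, u, v))
    return candidates
```

In the thesis, the overlap between the acoustic beams and the image plane is estimated dynamically; a filter of this kind is only the simplest static version of that idea.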
Consists of
Paper A: Cardaillac, Alexandre; Amundsen, Herman Biørn; Kelasidi, Eleni; Ludvigsen, Martin. Application of Maneuvering Based Control for Autonomous Inspection of Aquaculture Net Pens. In: 2023 8th Asia-Pacific Conference on Intelligent Robot Systems (ACIRS). IEEE conference proceedings. https://doi.org/10.1109/ACIRS58671.2023.10239708
Paper B: Scheiber, Martin; Cardaillac, Alexandre; Brommer, Christian; Weiss, Stephan; Ludvigsen, Martin. Modular Multi-Sensor Fusion for Underwater Localization for Autonomous ROV Operations. In: OCEANS 2022 Hampton Roads. https://doi.org/10.1109/OCEANS47191.2022.9977298
Paper C: Hirsch, Joseph; Elvesæter, Brian; Cardaillac, Alexandre; Bauer, Bernhard; Waszak, Maryna. Fusion of Multi-Modal Underwater Ship Inspection Data with Knowledge Graphs. In: OCEANS 2022 Hampton Roads. https://doi.org/10.1109/OCEANS47191.2022.9977371
Paper D: Cardaillac, Alexandre; Ludvigsen, Martin. Marine Snow Detection for Real Time Feature Detection. In: Proceedings of the 2022 IEEE/OES Symposium on Autonomous Underwater Vehicle Technology. https://doi.org/10.1109/AUV53081.2022.9965895
Paper E: Cardaillac, Alexandre; Ludvigsen, Martin. A Communication Interface for Multilayer Cloud Computing Architecture for Low Cost Underwater Vehicles. IFAC-PapersOnLine 2022, Vol. 55, no. 14, pp. 77-82. https://doi.org/10.1016/j.ifacol.2022.07.586 This is an open access article under the CC BY-NC-ND license (https://creativecommons.org/licenses/by-nc-nd/4.0/)
Paper F: Cardaillac, Alexandre; Ludvigsen, Martin. Path Following for Underwater Inspection allowing Manoeuvring Constraints. In: Intelligent Autonomous Systems 17. IAS 2022. Lecture Notes in Networks and Systems, vol 577. Springer, Cham. https://doi.org/10.1007/978-3-031-22216-0_58
Paper G: Cardaillac, Alexandre; Ludvigsen, Martin. Ruled Path Planning Framework for Safe and Dynamic Navigation. In: OCEANS 2021: San Diego – Porto. https://doi.org/10.23919/OCEANS44145.2021.9705699
Paper H: Cardaillac, Alexandre; Ludvigsen, Martin. Camera-Sonar Combination for Improved Underwater Localization and Mapping. IEEE Access 2023, Vol. 11, pp. 123070-123079. https://doi.org/10.1109/ACCESS.2023.3329834 This work is licensed under a Creative Commons Attribution 4.0 License (CC BY).
Paper I: Cardaillac, Alexandre; Skjetne, Roger; Ludvigsen, Martin. ROV-Based Autonomous Maneuvering for Ship Hull Inspection with Coverage Monitoring
Paper J: Waszak, Maryna; Cardaillac, Alexandre; Elvesæter, Brian; Rødølen, Frode; Ludvigsen, Martin. Semantic Segmentation in Underwater Ship Inspections: Benchmark and Data Set. IEEE Journal of Oceanic Engineering 2022, Vol. 48, no. 2, pp. 462-473. https://doi.org/10.1109/JOE.2022.3219129 This work is licensed under a Creative Commons Attribution 4.0 License (CC BY).