Camera-assisted Dynamic Positioning of ROVs
- Institutt for marin teknikk 
When an ROV is operated with manual joysticks, the operator experiences a lack of control due to currents and drag forces from the umbilical. These disturbances induce undesirable oscillations and motions that are hard to compensate for manually. With computer vision, it is possible to enhance control performance in unknown waters: the system finds a stationary object on the bottom and lets the ROV track a feature on that object, so that the vehicle can lock visually onto this "target". The proposed Feature Tracking algorithm then tells the control system where the object is, along with its size. The algorithm can be combined with different levels of manual input, depending on which degrees of freedom the user wants to control manually.

The thesis also proposes a module called Range Finder, which estimates the distance to the object or wall in front of the ROV. The estimate is based on two parallel lasers that project two laser dots on the wall. The distance between the dots is measured with computer vision by counting the number of pixels between them; this number changes with the distance from the ROV to the wall. A mapping is therefore found that gives the distance directly in metres instead of pixels. The module is also combined with different control modes, enabling features such as automatic distance keeping in Wall Inspection Mode.

Both modules were tested extensively, with good results, in the Marine Cybernetics Laboratory (MC-lab) at NTNU. The test platform is a low-cost ROV called uDrone, one of the vehicles in the MC-lab. The system runs the Robot Operating System (ROS) on Ubuntu, with OpenCV as the computer-vision library.
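The pixel-to-metres mapping used by the Range Finder can be illustrated with a simple pinhole-camera model: for two parallel lasers separated by a baseline b and a camera with focal length f (expressed in pixels), the dot separation in the image is approximately d ≈ f·b/Z, so the range is Z ≈ f·b/d. The thesis fits its mapping empirically, so the function below is only an illustrative sketch; the names `focal_length_px` and `laser_baseline_m` and the example values are assumptions, not parameters from the thesis.

```python
def distance_from_pixels(pixel_separation: float,
                         focal_length_px: float,
                         laser_baseline_m: float) -> float:
    """Estimate the range Z (in metres) to a wall from the pixel
    distance between two parallel laser dots, assuming a pinhole
    camera: pixel_separation ~= focal_length_px * laser_baseline_m / Z.
    """
    if pixel_separation <= 0:
        raise ValueError("laser dots not detected or coincident")
    return focal_length_px * laser_baseline_m / pixel_separation

# Example (hypothetical calibration): an 800 px focal length and
# lasers mounted 10 cm apart; dots measured 40 px apart in the image
# give an estimated range of 800 * 0.1 / 40 = 2.0 m.
```

Note the inverse relationship: the dots drift closer together in the image as the ROV backs away from the wall, which is why a calibration mapping from pixels to metres is needed before the estimate can feed an automatic distance controller.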