Mobile Autonomous Robot: Remote Operation
This thesis develops a concept for unifying previous solutions on the topic of robotized maintenance and remote operation, and explores possible solutions for a stereo vision system intended to support a collision avoidance system for the robot manipulator, which is part of the robot system. A concept for an Operator Control Station was also developed, intended to give the user remote presence when executing various tasks, and a user manual for the whole system was produced.

The system framework is based on the Robot Operating System (ROS), which runs as a server on the on-board computer. The system is divided into a server and a client: the server runs on the mobile robot, and the client is a remote computer acting as the Operator Control Station. The server and client communicate wirelessly over a LAN.

The robot manipulator, a SCORBOT-ER 4u manufactured by Intelitek, uses proprietary software and hardware for communication with and control from a computer, making it incompatible with operating systems other than Microsoft Windows. An interface was therefore needed to link the manipulator to the rest of the ROS system. To this end, MATLAB was used to create an interfacing node that handles commands sent to and from the manipulator.

A stereo vision system for producing depth maps was explored using the OpenCV library and two TP-Link IP cameras, the camera setup available at the time of writing. The purpose was to find a viable solution for facilitating a collision avoidance system for the robot manipulator.

The Operator Control Station was developed with the Qt framework, as a continuation of previous attempts at implementing a graphical user interface.
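The stereo vision approach explored in the thesis relies on OpenCV's block-matching implementations, but the underlying principle can be illustrated without that dependency. The sketch below, a naive NumPy implementation under simplifying assumptions (rectified grayscale images, search along one scanline direction only), computes a disparity map by finding, for each pixel in the left image, the horizontal shift that best matches a small window in the right image; depth is inversely proportional to that disparity. The function name and parameters are illustrative, not from the thesis.

```python
import numpy as np

def disparity_map(left, right, max_disp=16, block=5):
    """Naive block-matching stereo: for each pixel in the left image,
    find the horizontal shift (disparity) into the right image that
    minimizes the sum of absolute differences over a small window."""
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.float32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1]
            costs = [np.abs(patch - right[y - half:y + half + 1,
                                          x - d - half:x - d + half + 1]).sum()
                     for d in range(max_disp)]
            disp[y, x] = int(np.argmin(costs))
    return disp

# Synthetic test pair: the right image is the left image shifted 4 pixels
# to the left, so the true disparity is 4 wherever the window is valid.
rng = np.random.default_rng(0)
left = rng.random((20, 40)).astype(np.float32)
right = np.roll(left, -4, axis=1)
d = disparity_map(left, right, max_disp=8, block=5)
```

In practice OpenCV's optimized matchers (such as its block-matching stereo classes) replace this brute-force loop, and the resulting disparity map is what a collision avoidance system would convert to metric depth using the calibrated camera baseline and focal length.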
The Operator Control Station is divided into two modes: Drive Mode, which facilitates a previous solution for autonomous mapping and localization using SLAM, and Manipulator Mode, which provides control of the robot manipulator with a joystick. Sensory feedback is given in both modes and consists of direct camera feeds and produced maps. Lastly, a user manual, called the MAR User Manual, was written. It is produced as a stand-alone document and is meant to lighten the burden of research when starting new projects on the system.
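The server/client split described in the abstract, with a server on the robot's on-board computer and the Operator Control Station as a remote client on the same LAN, can be sketched in miniature. The real system exchanges messages over ROS topics and services; the plain TCP sockets below are only a stand-in to illustrate the architecture, and the host, port, and command format are illustrative assumptions.

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 5555  # on the robot this would be its LAN address

def robot_server(ready):
    """On-board server: accept one client and acknowledge its command."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()  # signal that the server is listening
        conn, _ = srv.accept()
        with conn:
            cmd = conn.recv(1024).decode()
            conn.sendall(f"ack:{cmd}".encode())

ready = threading.Event()
threading.Thread(target=robot_server, args=(ready,), daemon=True).start()
ready.wait()

# Operator Control Station side: send a drive command and await the ack.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect((HOST, PORT))
    cli.sendall(b"drive:forward")
    reply = cli.recv(1024).decode()

print(reply)  # ack:drive:forward
```

ROS hides this plumbing behind publish/subscribe topics, which is why the MATLAB interfacing node only needs to translate between ROS messages and the manipulator's proprietary protocol rather than manage connections itself.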