Show simple item record

dc.contributor.advisor: Egeland, Olav
dc.contributor.author: Solberg, Eirik Anfindsen
dc.date.accessioned: 2014-12-19T12:21:25Z
dc.date.available: 2014-12-19T12:21:25Z
dc.date.created: 2014-09-10
dc.date.issued: 2014
dc.identifier: 745627
dc.identifier: ntnudaim:11410
dc.identifier.uri: http://hdl.handle.net/11250/241009
dc.description.abstract: The main objective of this project was to create a visual system for object tracking and to implement it in the new robotics lab at the Department of Production and Quality Engineering. Two identical KUKA Agilus manipulators with six rotating axes are used. The robot kinematics with the corresponding Denavit-Hartenberg representation are presented. The position-based visual servoing method used for robotic control is shown with its control loop and the relations between the different coordinate bases and frames in the system.

Computer vision is a large part of this thesis. The different camera parameters and how to obtain them are explained. The results show the importance of accurate camera calibration, and that one should avoid the temptation to leave some parameters out of the equation, as this can cause large errors in the measurements.

The SIFT object detection method is explained, and its performance is compared with another method named SURF. The tests show that while SURF is faster than SIFT, it is outperformed when it comes to robustness. SIFT was therefore chosen for implementation at the lab.

With the object detected, the manipulator's movement must be calculated in order to place the camera in the desired position, 300 mm perpendicular to the object. The algorithm calculates the rotational and translational offset between the current camera position and the desired camera pose. A proportional regulator is then applied to calculate the next small step on the desired trajectory for the Agilus manipulator.

The practical setup of the robot cell is explained, with each step needed to obtain a working vision system. The information flow in the system can be chaotic, so a graphical representation is developed that shows all steps from image capture to robotic movement and plotting of the trajectories.

Results are presented as plots for both the distance calculations and the actual movement of the manipulator. The visual system tracks and follows the object successfully. There are, however, some issues with variations in the output from the object detection algorithm, which cause variations in the signal used as a reference for the robot. A filter was able to reduce these variations but not eliminate them. Possible solutions are presented and are believed to improve the speed and accuracy of the system if investigated further.
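The proportional-regulator step described in the abstract can be illustrated with a minimal sketch. This is not the thesis code: the pose representation (x, y, z in mm plus roll, pitch, yaw), the gain value, and the function name are all assumptions made for illustration.

```python
import numpy as np

def proportional_step(current_pose, desired_pose, gain=0.1):
    """Compute the next small step toward the desired camera pose.

    Poses are arrays (x, y, z, roll, pitch, yaw). The gain is a
    hypothetical value; a real system tunes it for stability.
    """
    # Rotational and translational offset between current and desired pose
    error = desired_pose - current_pose
    # Proportional regulator: move a fixed fraction of the remaining offset
    return current_pose + gain * error

# Desired pose: camera 300 mm perpendicular to the detected object
desired = np.array([0.0, 0.0, 300.0, 0.0, 0.0, 0.0])
current = np.array([0.0, 0.0, 500.0, 0.0, 0.1, 0.0])

step = proportional_step(current, desired, gain=0.1)
```

Each call moves the commanded pose 10% of the remaining offset, so repeated application converges toward the desired pose while keeping individual steps small, which matches the "next small step on the desired trajectory" behaviour the abstract describes.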
dc.language: eng
dc.publisher: Institutt for produksjons- og kvalitetsteknikk
dc.title: Vision Based Robotic Control
dc.type: Master thesis
dc.source.pagenumber: 100
dc.contributor.department: Norges teknisk-naturvitenskapelige universitet, Fakultet for ingeniørvitenskap og teknologi, Institutt for produksjons- og kvalitetsteknikk


Associated file(s)


This item appears in the following collection(s)
