Show simple item record

dc.contributor.advisor: Davies, Catharina de Lange
dc.contributor.advisor: Langø, Thomas
dc.contributor.author: Østvik, Andreas
dc.date.accessioned: 2016-09-19T14:01:11Z
dc.date.available: 2016-09-19T14:01:11Z
dc.date.created: 2016-06-30
dc.date.issued: 2016
dc.identifier: ntnudaim:14444
dc.identifier.uri: http://hdl.handle.net/11250/2408442
dc.description.abstract: Introducing autonomous robot systems in clinical medicine is deemed extremely challenging due to the complex scene involved and variations between patients. The majority of commercialized systems are controlled directly in a local telesurgical mode, requiring a high degree of human interaction. Research on robot control, vision and image-guided interventions has facilitated new integration possibilities, with the potential for increased autonomy in multiple stages of patient care. The work presented in this thesis is devoted to research and development of a framework for robot manipulation integrated with existing methods used in image-guided interventions. The framework employs the research platform CustusX, an open-source navigation software with functionality aimed towards interventional use. Both CustusX and the robot framework are written in the programming language C++, utilizing several external libraries. The systems are well integrated, creating an original nexus between robot manipulation and state-of-the-art solutions for image-guided interventions. Automatic calibration routines for spatially relating the robot manipulator to existing tools used in image-guided interventions are developed. The positional accuracy is in the sub-millimetre range, making the system suitable for several clinical applications. Furthermore, the integrated framework offers a range of functionalities and extension points. These include robot manipulation based on external input sources such as physical pointing instruments, as well as interaction with patients registered to preoperative data through an implemented user interface. In addition, a vision-based robot control system using ultrasound is developed, allowing autonomous robot motion by interpreting information obtained from image analysis. To investigate the performance of the implementations, several verification experiments are conducted with the robot manipulator UR5 from Universal Robots, together with other tools present in a typical operating scene. The evaluations demonstrate that the integrated robot system performs satisfactorily and offers extension possibilities towards clinical applications.
dc.language: eng
dc.publisher: NTNU
dc.subject: Physics and Mathematics, Biophysics and Medical Technology
dc.title: Robot Control in Image-Guided Intervention
dc.type: Master thesis
dc.source.pagenumber: 99
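
As context for the robot manipulation described in the abstract, the sketch below shows one conventional way to command a UR5: sending a single URScript movel command to the controller's secondary TCP interface (port 30002). This is an illustrative assumption, not the thesis implementation; the IP address, target pose and motion parameters are placeholders, and the actual framework integrates such commands with CustusX navigation data rather than hard-coding them.

    // Minimal sketch: commanding a UR5 pose over the secondary URScript interface.
    // The robot IP, target pose and motion parameters below are illustrative only.
    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>
    #include <unistd.h>
    #include <iostream>
    #include <string>

    int main() {
        const std::string robot_ip = "192.168.0.100";  // hypothetical robot address
        const int port = 30002;                        // UR secondary client interface

        int sock = socket(AF_INET, SOCK_STREAM, 0);
        if (sock < 0) { std::cerr << "socket() failed\n"; return 1; }

        sockaddr_in addr{};
        addr.sin_family = AF_INET;
        addr.sin_port = htons(port);
        inet_pton(AF_INET, robot_ip.c_str(), &addr.sin_addr);

        if (connect(sock, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) < 0) {
            std::cerr << "connect() failed\n";
            return 1;
        }

        // Target tool pose in the robot base frame: x, y, z [m] and an
        // axis-angle rotation [rad]. In an image-guided setting this pose would
        // come from navigation data (e.g. a tracked pointer registered to the
        // patient); here it is a placeholder value.
        const std::string script =
            "movel(p[0.3, -0.2, 0.4, 0.0, 3.14, 0.0], a=0.3, v=0.1)\n";

        if (send(sock, script.c_str(), script.size(), 0) < 0) {
            std::cerr << "send() failed\n";
            return 1;
        }

        close(sock);
        return 0;
    }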

