Sensor Control in Robotic Surgery
This doctoral thesis investigates sensor control in robotic surgery. At the present state of the art, robotic technology for surgical applications can broadly be divided into four main classes:

• Robotic assistance
• Remote-/teleoperated manipulators
• Autonomous and image-guided robotic surgery
• Micro-/nanorobotics

The thesis describes some of the most important surgical robotic systems commercially available, as well as systems under development at leading academic institutions and research centres. Selected academic work on force measurement for force feedback and force control in robotic surgery is also reviewed.

The use of sensor control in surgical robotic systems is still rather limited compared to applications in industrial robotics and underwater telemanipulation, and intuitive user interfaces with autonomous and semi-autonomous features are therefore lacking in surgical robotic systems. A more intuitive and transparent user interface is required for telemanipulation and robotic technology to succeed in surgery. How intuitively the manipulator can be controlled determines how much of the surgeon's mental capacity remains available for the surgical procedure itself. Both the way the surgeon interacts with the manipulator and the level of interaction play a role here: the level of interaction depends on how much of the control is exercised through physical user interaction and how much is left to the robot itself.

This thesis describes the development of two systems intended to improve user comfort and thereby make robotic surgery accessible to more surgeons:

1. A guided contour tracking system. This system tracks an unknown contour or surface once physical contact is established. It is based on measurements of the contact force normal to the surface the tool is touching.
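The force-based tracking principle just described — regulate the measured normal contact force toward a setpoint while feeding the tool along the tangent — can be sketched in simulation. This is an illustrative sketch only, not the thesis implementation: the contact model, contour, gains, and feed rate below are all invented for the example.

```python
import math

# Illustrative 2D contour-tracking sketch (not the thesis code): the
# controller regulates the normal contact force toward F_REF while
# advancing at a constant tangential feed. All constants are assumptions.

K_CONTACT = 2000.0  # N/m, assumed spring-like contact stiffness
F_REF = 2.0         # N, desired normal contact force
KP = 0.00025        # m/N, proportional gain on the force error
FEED = 0.002        # m per control step, tangential feed

def surface_height(x):
    """Unknown contour to be tracked (hidden from the controller)."""
    return 0.02 * math.sin(4.0 * x)

def measured_normal_force(x, y):
    """Contact model: normal force grows linearly with penetration depth."""
    depth = surface_height(x) - y
    return K_CONTACT * depth if depth > 0.0 else 0.0

def track(steps=300):
    """Follow the contour; return the mean absolute force error in N."""
    x = 0.0
    y = surface_height(x) - F_REF / K_CONTACT  # start in contact at F_REF
    errors = []
    for _ in range(steps):
        f = measured_normal_force(x, y)
        # Force loop: press in when force is too low, retract when too high.
        y -= KP * (F_REF - f)
        # Feed along the (locally horizontal) tangent direction.
        x += FEED
        errors.append(abs(F_REF - measured_normal_force(x, y)))
    return sum(errors) / len(errors)

mean_error = track()
```

With the chosen loop gain (KP * K_CONTACT = 0.5 per step) the force loop is stable and the tracking error stays small relative to the 2 N setpoint; a steeper contour or faster feed would call for a higher gain or an added integral term.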
This system is proposed as a solution for remote-/telemanipulation of an ultrasound probe, or any other tool, to help maintain physical contact with the patient's skin. It could also be used to hold a grinding tool at a prescribed angle relative to a surface, or to follow an anatomical guide such as the border between soft tissue and bone. The system can operate autonomously along the contour, or be controlled interactively from a steering console by changing the tracking speed, the tracking direction, and the orientation of the tracking tool.

2. A head tracking system for intuitive control of a robot-held stereo camera. This system changes the viewing direction of the camera as a function of the surgeon's head movements, which are measured with piezoelectric gyrosensors. In current keyhole surgery systems, the combination of a stereoscopic video camera and a head-mounted display (HMD) tends to make the surgeon motion sick, or forces him/her to hold the head in awkward positions to keep the visual target in the centre of the video image. The developed system makes changing the camera view more intuitive and less stressful. A user comparison test was performed to compare the proposed HeadTracking system with a voice control system and a head control system, referred to below as the VoiceCommand and HeadCommand systems respectively.

The guided contour tracking system was implemented and tested on a MultiCraft 560 robot, whose standard control system was extended with the active force control system on which the contour tracking is based. The experiments showed that the system performed well, with sufficient accuracy for the tested applications. The HeadTracking system was developed in four steps in which different sensor-signal interfaces to the robotic system were explored, and the final prototype was implemented and tested on the robotic scope holder Aesop3000DS.
The performance of the system was sufficient and gave the user the feeling of having the camera attached to his/her nose when moving the head.
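The gyro-based head tracking described above can be sketched as rate integration: angular-rate samples from the head-worn gyrosensors are accumulated into camera pan/tilt commands, with a small dead-band to reject sensor drift while the head is still. This is an illustrative sketch, not the thesis code; the sample rate, dead-band, and motion scaling are invented for the example.

```python
# Illustrative sketch (not the thesis implementation): mapping piezoelectric
# gyro rate readings to camera pan/tilt commands by integrating angular rate
# over time. All constants below are assumptions made for the example.

DT = 0.01        # s, assumed gyro sample period (100 Hz)
DEAD_BAND = 2.0  # deg/s, rates below this are treated as drift/noise
GAIN = 0.5       # camera degrees per head degree (scaled-down motion)

def integrate_rates(rate_samples):
    """Integrate (yaw_rate, pitch_rate) pairs in deg/s into (pan, tilt) in deg."""
    pan = 0.0
    tilt = 0.0
    for yaw_rate, pitch_rate in rate_samples:
        # Dead-band suppresses slow drift so the camera holds still
        # when the head is not deliberately moving.
        if abs(yaw_rate) > DEAD_BAND:
            pan += GAIN * yaw_rate * DT
        if abs(pitch_rate) > DEAD_BAND:
            tilt += GAIN * pitch_rate * DT
    return pan, tilt

# A head turn of 30 deg/s to the right for 1 s, then 1 s of drift-level noise:
samples = [(30.0, 0.0)] * 100 + [(0.5, -0.3)] * 100
pan, tilt = integrate_rates(samples)  # pan ≈ 15.0 deg, tilt = 0.0
```

The dead-band is the simplest drift countermeasure for rate gyros; a practical system would typically add bias calibration at start-up and a recentring gesture, since pure integration accumulates any residual bias over time.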