Show simple item record

dc.contributor.advisor: Blake, Richard E. [nb_NO]
dc.contributor.author: Karlsen, Jørn Skaarud [nb_NO]
dc.date.accessioned: 2014-12-19T13:33:00Z
dc.date.available: 2014-12-19T13:33:00Z
dc.date.created: 2010-09-03 [nb_NO]
dc.date.issued: 2005 [nb_NO]
dc.identifier: 348070 [nb_NO]
dc.identifier: ntnudaim:1009 [nb_NO]
dc.identifier.uri: http://hdl.handle.net/11250/250934
dc.description.abstract: Intra-operative Magnetic Resonance Imaging is a new modality for image-guided therapy, and Augmented Reality (AR) is an important emerging technology in this field. AR enables the development of tools that can be applied both pre-operatively and intra-operatively, helping users to see into the body, through organs, and to visualize the parts relevant to a specific procedure. The work presented in this thesis aims at solving several problems in order to develop an Augmented Reality system for real-life surgery in an MR environment. Specifically, correctly registering 3D imagery with the real world is the central problem of both Augmented Reality and this thesis, and emphasis is put on the static registration problem. Subproblems include: calibrating a video-see-through Head Mounted Display (HMD) entirely in Augmented Reality; registering a virtual object on a patient by placing a set of corresponding points on both the virtual object and the patient; and calculating the transformation needed for two overlapping tracking systems to deliver tracking signals in the same coordinate system. Additionally, problems and solutions related to the visualization of volume data and internal organs are presented: specifically, how to view virtual organs as if they were residing inside the body of a patient through a cut, though no surgical opening of the body has been performed, and the visualization and manipulation of a volume transfer function in a real-time Augmented Reality setting. The implementations use the Studierstube and OpenTracker software frameworks for visualization and for abstraction of tracking devices, respectively. OpenCV, a computer vision library, is used for image processing and calibration, together with Reg Willson's implementation of Tsai's calibration method.
The Augmented Reality based calibration implementation uses two different calibration methods, referred to in the literature as Zhang and Tsai camera calibration, for calibrating the intrinsic and extrinsic camera parameters respectively. Registration of virtual to real objects and of overlapping tracking systems is performed using a simplified version of the Iterative Closest Point (ICP) procedure, solving what is commonly referred to as the absolute orientation problem. The virtual-cut implementation works by projecting a rendered texture of a virtual organ and mapping it onto a mesh representation of a cut placed on the patient in Augmented Reality. The volume transfer functions are implemented as Catmull-Rom curves with control points that are movable in Augmented Reality; histograms visualize the transfer functions as well as the distribution of volume intensities. Results show that the Augmented Reality based camera calibration procedure suffers from inaccuracies in the sampling of points for extrinsic camera calibration, due to the motion present when wearing an HMD and holding a tracked pen. This type of calibration should be performed by sampling statically and averaging over several samples to reduce noise. The virtual-real registration and the alignment of overlapping tracking systems are also sensitive to sampling, and care has to be taken to sample accurately. The virtual-cut technique has been shown to increase the impression of a virtual object residing within the body of a patient, and the volume transfer function became easier to use after the histogram visualization was implemented, reducing the time needed to set up a transfer function. Many issues must be solved in order to build a useful medical Augmented Reality system; this thesis illustrates some of these problems and introduces solutions to a few.
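The absolute orientation problem mentioned above — finding the rigid transform that aligns two sets of corresponding points sampled in different coordinate systems — is commonly solved in closed form. A minimal sketch using the SVD-based (Kabsch/Horn-style) solution follows; this is a standard method, not the thesis's simplified-ICP code, and all point values are illustrative:

```python
import numpy as np

def absolute_orientation(P, Q):
    """Rigid transform (R, t) mapping point set P onto Q, so Q ≈ P @ R.T + t.

    P, Q: (N, 3) arrays of corresponding points (e.g. landmarks sampled
    on a virtual object and on the patient, or in two tracking systems).
    Closed-form SVD solution to the absolute orientation problem.
    """
    cp, cq = P.mean(axis=0), Q.mean(axis=0)          # centroids
    H = (P - cp).T @ (Q - cq)                        # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # Correction matrix guarantees a proper rotation (det(R) = +1).
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t

# Toy check: recover a known rotation about the z-axis plus a translation.
theta = np.pi / 6
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([1.0, -2.0, 0.5])
rng = np.random.default_rng(0)
P = rng.random((10, 3))
Q = P @ R_true.T + t_true
R, t = absolute_orientation(P, Q)
```

With exact correspondences the transform is recovered up to floating-point error; with noisy samples the same formula gives the least-squares fit, which is why averaging several static samples, as the results above recommend, improves accuracy.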
Further development is needed to bring the results of this thesis into a clinical setting, but the possibilities are many if such an integration is achieved. [nb_NO]
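The transfer-function curves described above can be illustrated with the standard uniform Catmull-Rom segment formula, which interpolates its two middle control points — the property that makes draggable control points intuitive in AR. This is the textbook formula with hypothetical opacity values, not code from the thesis:

```python
import numpy as np

def catmull_rom(p0, p1, p2, p3, t):
    """Evaluate a uniform Catmull-Rom segment between p1 and p2 at t in [0, 1].

    The curve passes through p1 (t=0) and p2 (t=1); p0 and p3 only
    shape the tangents, so moving one control point has local effect.
    """
    t2, t3 = t * t, t * t * t
    return 0.5 * ((2.0 * p1)
                  + (-p0 + p2) * t
                  + (2.0 * p0 - 5.0 * p1 + 4.0 * p2 - p3) * t2
                  + (-p0 + 3.0 * p1 - 3.0 * p2 + p3) * t3)

# Sample an opacity-over-intensity curve from movable control points
# (hypothetical values); one segment per consecutive point pair.
pts = np.array([0.0, 0.1, 0.8, 0.3, 0.0])
samples = [catmull_rom(pts[i - 1], pts[i], pts[i + 1], pts[i + 2], t)
           for i in range(1, 3)
           for t in np.linspace(0.0, 1.0, 8)]
```

In a volume-rendering setting the sampled curve would be written into a lookup table mapping voxel intensity to opacity or color, which the histogram visualization mentioned in the abstract helps the user position.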
dc.language: eng [nb_NO]
dc.publisher: Institutt for datateknikk og informasjonsvitenskap [nb_NO]
dc.subject: ntnudaim [no_NO]
dc.subject: SIF2 datateknikk [no_NO]
dc.subject: Program- og informasjonssystemer [no_NO]
dc.title: Augmented Reality for MR-guided surgery [nb_NO]
dc.type: Master thesis [nb_NO]
dc.source.pagenumber: 127 [nb_NO]
dc.contributor.department: Norges teknisk-naturvitenskapelige universitet, Fakultet for informasjonsteknologi, matematikk og elektroteknikk, Institutt for datateknikk og informasjonsvitenskap [nb_NO]

