Show simple item record

dc.contributor.author: Smistad, Erik
dc.contributor.author: Østvik, Andreas
dc.contributor.author: Pedersen, André
dc.description.abstract: Deep convolutional neural networks have quickly become the standard for medical image analysis. Although there are many frameworks focusing on training neural networks, few focus on high-performance inference and visualization of medical images. Neural network inference requires an inference engine (IE), and several IEs are currently available, including Intel's OpenVINO, NVIDIA's TensorRT, and Google's TensorFlow, which supports multiple backends such as NVIDIA's cuDNN, AMD's ROCm, and Intel's MKL-DNN. These IEs only work on specific processors and have completely different application programming interfaces (APIs). In this paper, we present methods for extending FAST, an open-source high-performance framework for medical imaging, to use any IE through a common programming interface, thereby making it easier for users to deploy and test their neural networks on different processors. This article provides an overview of current IEs and how they can be combined with existing software such as FAST. The methods are demonstrated and evaluated on three performance-demanding medical use cases: real-time ultrasound image segmentation, computed tomography (CT) volume segmentation, and patch-wise classification of whole slide microscopy images. Runtime performance was measured on the three use cases with several different IEs and processors. This revealed that the choice of IE and processor can affect the performance of medical neural network image analysis considerably. In the most extreme case, processing 171 ultrasound frames took half a second with the fastest configuration and 24 seconds with the slowest. For volume processing, using the CPU versus the GPU showed a difference of 2 vs. 53 seconds, and for processing a whole slide microscopy image, the difference was 81 seconds vs. almost 16 minutes. Source code, binary releases and test data can be found online on GitHub at .
dc.publisher: Institute of Electrical and Electronics Engineers (IEEE)
dc.rights: Attribution 4.0 International (Navngivelse 4.0 Internasjonal)
dc.title: High Performance Neural Network Inference, Streaming, and Visualization of Medical Images Using FAST
dc.type: Journal article
dc.type: Peer reviewed
dc.source.journal: IEEE Access
dc.relation.project: Norges forskningsråd: 270941
dc.description.localcode: This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see
cristin.unitname: Institutt for sirkulasjon og bildediagnostikk

