Show simple record

dc.contributor.author: Muri, Harald Ian
dc.contributor.author: Hjelme, Dag Roar
dc.date.accessioned: 2022-02-23T15:20:31Z
dc.date.available: 2022-02-23T15:20:31Z
dc.date.created: 2021-03-17T12:24:36Z
dc.date.issued: 2021
dc.identifier.issn: 0277-786X
dc.identifier.uri: https://hdl.handle.net/11250/2981092
dc.description.abstract: Minimizing the environmental impact of the incineration process and producing energy efficiently are the most important considerations for efficient operation of waste-to-energy (WtE) plants. WtE operation can be improved significantly by predicting the combustion properties of municipal solid waste (MSW) prior to incineration. Combustion properties of MSW can be assessed by estimating the weighted waste fractions such as paper and cardboard, plastic, or inert and fines. Waste materials and fractions can be recognized using imaging techniques and image classification methods based on deep convolutional neural networks (CNN). We have tested a new sensor system for image classification that uses multispectral (MS) images and a deep CNN pretrained on the ImageNet database to recognize MSW categories. The MS camera was used to sample images above the walking floor of a WtE plant (StatKraft Varme Tiller, Trondheim). Each waste load was automatically registered as industrial or household waste at the time of delivery. The MS images from 49 waste loads were used to perform transfer learning on an EfficientNetB0 model weighted with ImageNet-NoisyStudent parameters. Using the predefined classes, a training and test set were generated from the 49 waste loads delivered between June and September 2020. The training set consisted of 35 waste loads, while the remaining 14 waste loads were used as the test set. The weights of the image feature extractor were kept constant during training, while the fully connected layer (top layer) was updated each epoch. The model performance on the test set was assessed by making predictions on the household and industrial waste images. With a fixed threshold value of 0.5, the model showed 85% accuracy, 92% precision, 89% recall and 90% F-measure for the industrial class, while for the household class the model showed 80% accuracy, 94% precision, 81% recall and 87% F-measure. Across all threshold values, the area under the curve estimated from the receiver operating characteristic plot showed that the model has 87% confidence in distinguishing household waste images from industrial waste images and 90% confidence in distinguishing industrial waste images from household waste images. [en_US]
dc.language.iso: eng [en_US]
dc.publisher: Society of Photo Optical Instrumentation Engineers (SPIE) [en_US]
dc.title: Classification of municipal solid waste using deep convolutional neural network model applied to multispectral images [en_US]
dc.type: Peer reviewed [en_US]
dc.type: Journal article [en_US]
dc.description.version: acceptedVersion [en_US]
dc.rights.holder: © Society of Photo Optical Instrumentation Engineers. One print or electronic copy may be made for personal use only. Systematic reproduction and distribution, duplication of any material in this paper for a fee or for commercial purposes, or modification of the content of the paper are prohibited. [en_US]
dc.source.journal: Proceedings of SPIE, the International Society for Optical Engineering [en_US]
dc.identifier.doi: https://doi.org/10.1117/12.2590224
dc.identifier.cristin: 1898625
dc.relation.project: Norges forskningsråd: 280949 - Waste-to-Energy 2030 [en_US]
cristin.ispublished: true
cristin.fulltext: postprint
cristin.qualitycode: 1
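
The abstract above outlines the training procedure: a frozen EfficientNetB0 feature extractor, a trainable fully connected top layer, binary household/industrial labels, a 0.5 decision threshold, and ROC area-under-curve evaluation. The sketch below is a minimal TensorFlow/Keras illustration of that setup, not the authors' code: the image size, directory layout, batch size, optimizer and epoch count are assumptions, the multispectral input is assumed to be reduced to three channels, and standard ImageNet weights stand in for the ImageNet-NoisyStudent weights named in the abstract (which are not shipped with Keras).

import tensorflow as tf
from tensorflow.keras import layers, models

# Hypothetical input size and directory layout: one subfolder per class
# (household/, industrial/) under each directory.
IMG_SIZE = (224, 224)
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train", image_size=IMG_SIZE, batch_size=32, label_mode="binary")
test_ds = tf.keras.utils.image_dataset_from_directory(
    "data/test", image_size=IMG_SIZE, batch_size=32, label_mode="binary")

# Frozen EfficientNetB0 feature extractor (ImageNet weights as a stand-in
# for the NoisyStudent weights mentioned in the abstract).
base = tf.keras.applications.EfficientNetB0(
    include_top=False, weights="imagenet", pooling="avg",
    input_shape=IMG_SIZE + (3,))
base.trainable = False

# Only the fully connected top layer is trained.
model = models.Sequential([
    base,
    layers.Dense(1, activation="sigmoid"),  # household vs. industrial
])
model.compile(
    optimizer="adam",
    loss="binary_crossentropy",
    metrics=[tf.keras.metrics.BinaryAccuracy(threshold=0.5),
             tf.keras.metrics.Precision(),
             tf.keras.metrics.Recall(),
             tf.keras.metrics.AUC()])  # ROC area under curve

model.fit(train_ds, epochs=10)                   # epoch count is an assumption
print(model.evaluate(test_ds, return_dict=True))

Freezing the backbone and updating only the sigmoid output layer each epoch mirrors the transfer-learning step described in the abstract; the per-class precision, recall and F-measure reported in the record would be computed from the confusion matrix at the 0.5 threshold.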


Associated file(s)


This item appears in the following collection(s)
