Show simple item record

dc.contributor.author	Boutros, Fadi
dc.contributor.author	Damer, Naser
dc.contributor.author	Raja, Kiran
dc.contributor.author	Kirchbuchner, Florian
dc.contributor.author	Kuijper, Arjan
dc.date.accessioned	2023-02-28T08:36:51Z
dc.date.available	2023-02-28T08:36:51Z
dc.date.created	2022-11-02T14:23:47Z
dc.date.issued	2022
dc.identifier.citation	Sensors. 2022, 22 (5).	en_US
dc.identifier.issn	1424-8220
dc.identifier.uri	https://hdl.handle.net/11250/3054517
dc.description.abstract	This work addresses the challenge of building an accurate and generalizable periocular recognition model with a small number of learnable parameters. Deeper (larger) models are typically more capable of learning complex information. For this reason, knowledge distillation (KD) was previously proposed to carry this knowledge from a large model (teacher) into a small model (student). Conventional KD optimizes the student output to be similar to the teacher output (commonly classification output). In biometrics, comparison (verification) and storage operations are conducted on biometric templates, extracted from pre-classification layers. In this work, we propose a novel template-driven KD approach that optimizes the distillation process so that the student model learns to produce templates similar to those produced by the teacher model. We demonstrate our approach on intra- and cross-device periocular verification. Our results demonstrate the superiority of our proposed approach over a network trained without KD and networks trained with conventional (vanilla) KD. For example, the targeted small model achieved an equal error rate (EER) value of 22.2% on cross-device verification without KD. The same model achieved an EER of 21.9% with the conventional KD, and only 14.7% EER when using our proposed template-driven KD.	en_US
dc.language.iso	eng	en_US
dc.publisher	MDPI	en_US
dc.rights	Attribution 4.0 International
dc.rights.uri	http://creativecommons.org/licenses/by/4.0/deed.no
dc.title	Template-Driven Knowledge Distillation for Compact and Accurate Periocular Biometrics Deep-Learning Models	en_US
dc.title.alternative	Template-Driven Knowledge Distillation for Compact and Accurate Periocular Biometrics Deep-Learning Models	en_US
dc.type	Peer reviewed	en_US
dc.type	Journal article	en_US
dc.description.version	publishedVersion	en_US
dc.source.pagenumber	0	en_US
dc.source.volume	22	en_US
dc.source.journal	Sensors	en_US
dc.source.issue	5	en_US
dc.identifier.doi	10.3390/s22051921
dc.identifier.cristin	2068139
cristin.ispublished	true
cristin.fulltext	original
cristin.qualitycode	1
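The template-driven distillation objective described in the abstract can be illustrated with a minimal sketch: instead of matching classification outputs, the student is penalized for producing embeddings (templates) that differ from the teacher's. This is an assumption-laden illustration, not the authors' exact loss; the function names and the choice of mean squared error over L2-normalized templates are hypothetical.

```python
import numpy as np

def l2_normalize(x, eps=1e-12):
    # Normalize each template (row) to unit length, as is common
    # before biometric template comparison.
    return x / (np.linalg.norm(x, axis=1, keepdims=True) + eps)

def template_distillation_loss(student_templates, teacher_templates):
    # Hypothetical template-driven KD loss: mean squared error between
    # the normalized student and teacher templates.
    s = l2_normalize(student_templates)
    t = l2_normalize(teacher_templates)
    return float(np.mean((s - t) ** 2))

# Identical templates yield zero loss; mismatched ones yield a positive loss.
teacher = np.array([[1.0, 0.0], [0.0, 1.0]])
student_diff = np.array([[0.0, 1.0], [1.0, 0.0]])
```

In training, this term would be minimized alongside (or instead of) the usual classification loss, pulling the student's pre-classification layer toward the teacher's template space.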


Associated file(s)


This item appears in the following collection(s)


Attribution 4.0 International
Except where otherwise noted, this item's license is described as Attribution 4.0 International