Show simple item record

dc.contributor.advisor: Trémeau, Alain
dc.contributor.advisor: Muselet, Damien
dc.contributor.advisor: Torres, Cindy
dc.contributor.advisor: Robert, Olivier
dc.contributor.author: Díaz Estrada, David Norman
dc.date.accessioned: 2022-10-01T17:24:17Z
dc.date.available: 2022-10-01T17:24:17Z
dc.date.issued: 2022
dc.identifier: no.ntnu:inspera:118516831:67608993
dc.identifier.uri: https://hdl.handle.net/11250/3023082
dc.description: Full text not available
dc.description.abstract: Due to the increasing demand for agricultural products and the need to advance phenotyping and breeding methods that improve product quality, precise detection and segmentation methods are needed to automatically measure relevant traits of fruit and vegetable crops. Every year, new typologies and varieties are developed and introduced to the agri-food sector, where deep-learning-based segmentation methods play a key role in modern phenotyping processes, with successful implementations that help breeders accelerate the process. Nonetheless, the annotation work required to create the datasets for training these deep models is time-consuming and tedious, and it constitutes a bottleneck given the yearly changes in fruit varieties. In this work we study the possibility of leveraging previous work and existing annotations from one species for the benefit of another; in this context, we focus on cucumbers and carrots in an uncontrolled environment. For this purpose, we developed a CycleGAN pipeline that effectively transforms cucumbers into carrots and creates a synthetic carrot dataset for instance segmentation. Moreover, our experiments show that, in the case of translation between cucumbers and carrots, CycleGAN learns not only the color and texture but also how to change the shape of both domains during translation, thus generating realistic carrots and cucumbers. We also show that it is possible to obtain realistic results by training CycleGAN with just 250 original annotations per domain. Furthermore, we developed a carrot segmentation model for uncontrolled conditions using Mask R-CNN with ResNeXt-101 as the backbone. The highest F1 score was 93% on validation images of real carrots, achieved by the models that incorporate fake carrot data during training, and the best AP was 74.5% on test images.
Moreover, we tested the proposed method on a pepper2squash task, which validates the robustness of our CycleGAN pipeline and the possibility of using it between other species to create synthetic instance-segmentation datasets and in other image-to-image translation scenarios.
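The cycle-consistency idea the abstract relies on (translating cucumber images to carrots and back while recovering the original) can be illustrated with a toy sketch. This is not the thesis code: the linear mappings `G` and `F` are hypothetical stand-ins for the real generator networks, used only to show the loss term CycleGAN optimizes.

```python
# Toy illustration of the CycleGAN cycle-consistency loss:
#   L_cyc = E[||F(G(x)) - x||_1] + E[||G(F(y)) - y||_1]
# G maps domain X -> Y (e.g. cucumber -> carrot), F maps Y -> X.
# These 1-D linear "generators" are placeholders, not trained models.

def G(x):
    # hypothetical forward mapping X -> Y
    return 2.0 * x + 1.0

def F(y):
    # hypothetical inverse mapping Y -> X
    return (y - 1.0) / 2.0

def l1(a, b):
    # L1 distance between two scalars
    return abs(a - b)

def cycle_consistency_loss(xs, ys):
    # average reconstruction error in both translation directions
    fwd = sum(l1(F(G(x)), x) for x in xs) / len(xs)
    bwd = sum(l1(G(F(y)), y) for y in ys) / len(ys)
    return fwd + bwd

xs = [0.0, 1.0, 2.0]   # samples from domain X
ys = [1.0, 3.0, 5.0]   # samples from domain Y
print(cycle_consistency_loss(xs, ys))  # exact inverses -> 0.0
```

In training, this term is added to the adversarial losses of both discriminators, which is what lets the model learn the mapping from unpaired cucumber and carrot images.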
dc.language: eng
dc.publisher: NTNU
dc.title: Use of transfer learning and GANs to create a carrot segmentation model from work done on cucumbers, in uncontrolled conditions
dc.type: Master thesis


