Show simple item record

dc.contributor.author	Ghi, T
dc.contributor.author	Conversano, Francesco
dc.contributor.author	Ramirez Zegarra, R
dc.contributor.author	Pisani, P
dc.contributor.author	Dall'Asta, A
dc.contributor.author	Lanzone, A
dc.contributor.author	Lau, W
dc.contributor.author	Vimercati, A
dc.contributor.author	Iliescu, DG
dc.contributor.author	Mappa, I
dc.contributor.author	Rizzo, G
dc.contributor.author	Casciaro, S
dc.contributor.author	Eggebø, Torbjørn Moe
dc.date.accessioned	2024-07-15T09:14:33Z
dc.date.available	2024-07-15T09:14:33Z
dc.date.created	2023-01-09T13:33:11Z
dc.date.issued	2022
dc.identifier.citation	Ultrasound in Obstetrics and Gynecology. 2022, 59 (1), 93-99.	en_US
dc.identifier.issn	0960-7692
dc.identifier.issn	1469-0705
dc.identifier.uri	https://hdl.handle.net/11250/3141240
dc.description.abstract	Objectives: To describe a newly developed machine-learning (ML) algorithm for the automatic recognition of fetal head position using transperineal ultrasound (TPU) during the second stage of labor, and to describe its performance in differentiating between occiput anterior (OA) and non-OA positions.

Methods: This was a prospective cohort study including singleton term (> 37 weeks of gestation) pregnancies in the second stage of labor, with a non-anomalous fetus in cephalic presentation. Transabdominal ultrasound was performed to determine whether the fetal head position was OA or non-OA. For each case, one sonographic image of the fetal head was then acquired in an axial plane using TPU and saved for later offline analysis. Using the transabdominal sonographic diagnosis as the gold standard, an ML algorithm based on a pattern-recognition feed-forward neural network was trained on the TPU images to discriminate between OA and non-OA positions. In the training phase, the model tuned its parameters to approximate the training data (i.e. the training dataset) so that it would correctly identify the fetal head position, exploiting geometric, morphological and intensity-based features of the images. In the testing phase, the algorithm was blinded to the occiput position as determined by transabdominal ultrasound. Using the test dataset, the ability of the ML algorithm to differentiate OA from non-OA fetal positions was assessed in terms of diagnostic accuracy. The F1-score and precision-recall area under the curve (PR-AUC) were calculated to assess the algorithm's performance, and Cohen's kappa (κ) was calculated to evaluate the agreement between the algorithm and the gold standard.

Results: Over a period of 24 months (February 2018 to January 2020), at 15 maternity hospitals affiliated to the International Study group on Labor ANd Delivery Sonography (ISLANDS), we enrolled 1219 women in the second stage of labor into the study. On the basis of transabdominal ultrasound, they were classified as OA (n = 801 (65.7%)) or non-OA (n = 418 (34.3%)). From the entire cohort (OA and non-OA), approximately 70% (n = 824) of the patients were assigned randomly to the training dataset and the rest (n = 395) were used as the test dataset. The ML-based algorithm correctly classified the fetal occiput position in 90.4% (357/395) of the test dataset, including 224/246 (91.1%) with OA and 133/149 (89.3%) with non-OA fetal head position. Evaluation of the algorithm's performance gave an F1-score of 88.7% and a PR-AUC of 85.4%, showing a balanced performance in the recognition of both OA and non-OA positions. The robustness of the algorithm was confirmed by high agreement with the gold standard (κ = 0.81; P < 0.0001).

Conclusions: This newly developed ML-based algorithm for the automatic assessment of fetal head position using TPU can, in most cases, accurately differentiate between OA and non-OA positions in the second stage of labor. It has the potential to support not only obstetricians but also midwives and accoucheurs in the clinical use of TPU to determine fetal occiput position in the labor ward.	en_US
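The evaluation pipeline described in the abstract (≈70/30 train/test split, feed-forward network, then accuracy, F1-score, PR-AUC and Cohen's kappa against the transabdominal gold standard) can be sketched as below. This is an illustrative sketch only: the study's actual network architecture and image-derived features are not given in this record, so random synthetic feature vectors stand in for the geometric, morphological and intensity-based features extracted from the TPU images, and scikit-learn's `MLPClassifier` stands in for the authors' network.

```python
# Sketch of the study's evaluation scheme on synthetic stand-in data.
# Assumptions (not from the record): 16-dimensional toy features,
# one hidden layer of 32 units, MLPClassifier as the network.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import (accuracy_score, f1_score,
                             average_precision_score, cohen_kappa_score)

rng = np.random.default_rng(0)

# 1219 cases with the reported ~65.7% OA / 34.3% non-OA class split;
# label 1 = OA, 0 = non-OA.
n = 1219
y = (rng.random(n) < 0.657).astype(int)
X = rng.normal(size=(n, 16)) + y[:, None] * 1.5  # separable toy features

# ~70% training / ~30% test, as in the study design.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.30, stratify=y, random_state=0)

# Pattern-recognition feed-forward neural network classifier.
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000,
                    random_state=0).fit(X_tr, y_tr)

pred = clf.predict(X_te)              # hard OA / non-OA calls
proba = clf.predict_proba(X_te)[:, 1] # P(OA), needed for PR-AUC

print("accuracy:", accuracy_score(y_te, pred))
print("F1-score:", f1_score(y_te, pred))
print("PR-AUC  :", average_precision_score(y_te, proba))  # precision-recall AUC
print("kappa   :", cohen_kappa_score(y_te, pred))         # agreement vs. gold standard
```

On these artificially separable features the scores are near-perfect; the point of the sketch is the metric set and split, not the numbers, which in the study were computed on real TPU images.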
dc.language.iso	eng	en_US
dc.publisher	Wiley	en_US
dc.title	Novel artificial intelligence approach for automatic differentiation of fetal occiput anterior and non-occiput anterior positions during labor	en_US
dc.title.alternative	Novel artificial intelligence approach for automatic differentiation of fetal occiput anterior and non-occiput anterior positions during labor	en_US
dc.type	Journal article	en_US
dc.type	Peer reviewed	en_US
dc.description.version	publishedVersion	en_US
dc.rights.holder	This version of the article is not available due to the publisher copyright restrictions.	en_US
dc.source.pagenumber	93-99	en_US
dc.source.volume	59	en_US
dc.source.journal	Ultrasound in Obstetrics and Gynecology	en_US
dc.source.issue	1	en_US
dc.identifier.doi	10.1002/uog.23739
dc.identifier.cristin	2103300
cristin.ispublished	true
cristin.fulltext	original
cristin.qualitycode	2

