Show simple item record

dc.contributor.author  Geyer, Fabien
dc.contributor.author  Bondorf, Steffen
dc.date.accessioned  2020-01-15T13:04:12Z
dc.date.available  2020-01-15T13:04:12Z
dc.date.created  2019-08-05T11:08:50Z
dc.date.issued  2019
dc.identifier.citation  IEEE Infocom. Proceedings. 2019, 2019-April 1009-1017.
dc.identifier.issn  0743-166X
dc.identifier.uri  http://hdl.handle.net/11250/2636441
dc.description.abstract  Network calculus computes end-to-end delay bounds for individual data flows in networks of aggregate schedulers. It searches for the best model bounding resource contention between these flows at each scheduler. When analyzing networks, this leads to complex dependency structures, and finding the tightest delay bounds becomes a resource-intensive task. The exhaustive search for the best combination of contention models is known as Tandem Matching Analysis (TMA). The challenge TMA overcomes is that a contention model in one location of the network can have a huge impact on one in another location. These locations can, however, be many analysis steps apart from each other. TMA can derive delay bounds with a high degree of tightness but needs several hours of computation to do so. We avoid the effort of exhaustive search altogether by predicting the best contention models for each location in the network. For effective predictions, our main contribution in this paper is a novel framework combining graph-based deep learning and Network Calculus (NC) models. The framework learns from NC, predicts the best NC models and feeds them back to NC. Deriving a first heuristic from this framework, called DeepTMA, we achieve provably valid bounds that are very competitive with TMA. We observe a maximum relative error below 6%, while execution times remain nearly constant and outperform TMA in moderately sized networks by several orders of magnitude.
dc.language.iso  eng
dc.publisher  Institute of Electrical and Electronics Engineers (IEEE)
dc.title  DeepTMA: Predicting Effective Contention Models for Network Calculus using Graph Neural Networks
dc.type  Journal article
dc.type  Peer reviewed
dc.description.version  acceptedVersion
dc.source.pagenumber  1009-1017
dc.source.volume  2019-April
dc.source.journal  IEEE Infocom. Proceedings
dc.identifier.doi  10.1109/INFOCOM.2019.8737496
dc.identifier.cristin  1714003
dc.description.localcode  © 2019 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
cristin.unitcode  194,63,30,0
cristin.unitname  Institutt for informasjonssikkerhet og kommunikasjonsteknologi
cristin.ispublished  true
cristin.fulltext  postprint
cristin.qualitycode  2


Associated file(s)


This item appears in the following collection(s)
