
dc.contributor.author: Kianpour, Mazaher
dc.contributor.author: Wen, Shao-Fang
dc.date.accessioned: 2021-09-07T11:22:12Z
dc.date.available: 2021-09-07T11:22:12Z
dc.date.created: 2019-10-30T12:04:30Z
dc.date.issued: 2020
dc.identifier.citation: Advances in Intelligent Systems and Computing. 2020, 1037, 111-125.
dc.identifier.issn: 2194-5357
dc.identifier.uri: https://hdl.handle.net/11250/2774001
dc.description.abstract: Machine learning plays a significant role in today's business sectors and governments, where it is increasingly used as a tool to support decision making and automate processes. However, these tools are not inherently robust and secure; they can be vulnerable to adversarial modification, leading to false classification or risks to system security. As such, the field of adversarial machine learning has emerged to study the vulnerabilities of machine learning models and algorithms and to make them secure against adversarial manipulation. In this paper, we present the recently proposed taxonomy for attacks on machine learning and draw distinctions from other taxonomies. Moreover, this paper brings together the state of the art in theory and practice needed for decision timing attacks on machine learning and defense strategies against them. Considering the increasing research interest in this field, we hope this study provides readers with the essential knowledge to successfully engage in research and practice of machine learning in adversarial environments.
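The abstract's claim that small adversarial modifications can cause false classification can be illustrated with a minimal sketch, not taken from the paper: a fast-gradient-sign-style perturbation against a toy logistic-regression model. The weights, inputs, and epsilon budget below are illustrative assumptions.

```python
# Illustrative sketch only (not from the paper): a fast-gradient-sign-style
# perturbation against a toy logistic-regression classifier. The model
# parameters, input, and epsilon budget are assumed for demonstration.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Assumed "trained" model: weights w and bias b.
w = np.array([1.5, -2.0, 0.5])
b = 0.1

def predict(x):
    """Return the model's probability of class 1 for input x."""
    return sigmoid(np.dot(w, x) + b)

def fgsm_perturb(x, y_true, epsilon=0.5):
    """Shift x by epsilon in the direction that increases the loss for y_true."""
    p = predict(x)
    # Gradient of the binary cross-entropy loss w.r.t. the input is (p - y) * w.
    grad_x = (p - y_true) * w
    return x + epsilon * np.sign(grad_x)

x = np.array([0.2, -0.5, 0.8])   # a benign input, assumed to belong to class 1
x_adv = fgsm_perturb(x, y_true=1.0)

print("clean prediction:       %.3f" % predict(x))      # ~0.86, class 1
print("adversarial prediction: %.3f" % predict(x_adv))  # ~0.45, flipped to class 0
```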
dc.language.iso: eng
dc.publisher: Springer
dc.title: Timing Attacks on Machine Learning: State of the Art
dc.type: Peer reviewed
dc.type: Journal article
dc.description.version: publishedVersion
dc.source.pagenumber: 111-125
dc.source.volume: 1037
dc.source.journal: Advances in Intelligent Systems and Computing
dc.identifier.doi: 10.1007/978-3-030-29516-5_10
dc.identifier.cristin: 1742193
dc.description.localcode: This article will not be available due to copyright restrictions (c) 2020 by Springer
cristin.ispublished: true
cristin.fulltext: original
cristin.qualitycode: 1

