Show simple item record

dc.contributor.author: Yin, X
dc.contributor.author: Ruan, W
dc.contributor.author: Fieldsend, JE
dc.date.accessioned: 2024-05-29T12:05:40Z
dc.date.issued: 2022-10-31
dc.date.updated: 2024-05-29T11:02:28Z
dc.description.abstract: The adversarial attack can force a CNN-based model to produce an incorrect output by craftily manipulating human-imperceptible input. Exploring such perturbations can help us gain a deeper understanding of the vulnerability of neural networks, and provide robustness to deep learning against miscellaneous adversaries. Despite extensive studies of robustness in the image, audio, and NLP domains, work on adversarial examples for visual object tracking, especially in a black-box manner, is quite lacking. In this paper, we propose a novel adversarial attack method to generate noise for single object tracking under black-box settings, where perturbations are added only to the initialized frames of tracking sequences, which is difficult to notice from the perspective of a whole video clip. Specifically, we divide our algorithm into three components and exploit reinforcement learning to localize important frame patches precisely while reducing unnecessary query overhead. Compared to existing techniques, our method requires less time to perturb videos while achieving competitive or even better adversarial performance. We test our algorithm on both long-term and short-term datasets, including OTB100, VOT2018, UAV123, and LaSOT. Extensive experiments demonstrate the effectiveness of our method on three mainstream types of trackers: discriminative, Siamese-based, and reinforcement learning-based trackers. We release our attack tool, DIMBA, via GitHub (https://github.com/TrustAI/DIMBA) for use by the community. (A minimal illustrative sketch of this attack setting follows the record below.) (en_GB)
dc.description.sponsorship: Engineering and Physical Sciences Research Council (EPSRC) (en_GB)
dc.format.extent: 1705-1723
dc.identifier.citation: Vol. 113, article 4 (en_GB)
dc.identifier.doi: https://doi.org/10.1007/s10994-022-06252-2
dc.identifier.grantnumber: EP/R026173/1 (en_GB)
dc.identifier.uri: http://hdl.handle.net/10871/136061
dc.identifier: ORCID: 0000-0002-0683-2583 (Fieldsend, Jonathan E)
dc.language.iso: en (en_GB)
dc.publisher: Springer (en_GB)
dc.relation.url: https://github.com/TrustAI/DIMBA (en_GB)
dc.rights: © The Author(s) 2022. Open Access. This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. (en_GB)
dc.subject: Visual object tracking (en_GB)
dc.subject: Adversarial example (en_GB)
dc.subject: Black-box attack (en_GB)
dc.title: DIMBA: discretely masked black-box attack in single object tracking (en_GB)
dc.type: Article (en_GB)
dc.date.available: 2024-05-29T12:05:40Z
dc.identifier.issn: 0885-6125
exeter.article-number: 4
dc.description: This is the final version, available on open access from Springer via the DOI in this record. (en_GB)
dc.description: Data availability: Our code is available on https://github.com/TrustAI/DIMBA. (en_GB)
dc.identifier.eissn: 1573-0565
dc.identifier.journal: Machine Learning (en_GB)
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/ (en_GB)
dcterms.dateAccepted: 2022-09-19
rioxxterms.version: VoR (en_GB)
rioxxterms.licenseref.startdate: 2022-10-31
rioxxterms.type: Journal Article/Review (en_GB)
refterms.dateFCD: 2024-05-29T12:00:21Z
refterms.versionFCD: VoR
refterms.dateFOA: 2024-05-29T12:06:05Z
refterms.panel: B (en_GB)
refterms.dateFirstOnline: 2022-10-31
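
The abstract above outlines the attack setting: noise is added only to the initial frame of a tracking sequence, and the tracker is queried as a black box. The following is a minimal, hypothetical sketch of that setting only; it is not the authors' DIMBA algorithm (which uses reinforcement learning to localize important frame patches). The `run_tracker` callable, the random patch search, and all parameter values are illustrative assumptions.

```python
# Hypothetical sketch: first-frame-only black-box perturbation of a tracker.
# NOT the DIMBA implementation (see https://github.com/TrustAI/DIMBA);
# random patch search here stands in for DIMBA's RL-based patch localization.
import numpy as np

def attack_first_frame(frames, run_tracker, eps=8.0, patch=32, queries=200, seed=0):
    """Perturb one random patch of the first frame per query; keep the noise
    that most degrades mean tracking IoU over the clip.

    frames:      list of HxWx3 uint8 arrays (H, W > patch)
    run_tracker: black-box callable, list of frames -> mean IoU with ground truth
    eps:         L_inf budget on the first-frame perturbation
    """
    rng = np.random.default_rng(seed)
    first = frames[0].astype(np.float32)
    h, w = first.shape[:2]
    best_noise = np.zeros_like(first)
    best_iou = run_tracker(frames)                      # clean baseline query
    for _ in range(queries):
        noise = best_noise.copy()
        y = rng.integers(0, h - patch)                  # random patch location
        x = rng.integers(0, w - patch)
        noise[y:y+patch, x:x+patch] += rng.uniform(
            -eps, eps, noise[y:y+patch, x:x+patch].shape)
        noise = np.clip(noise, -eps, eps)               # enforce L_inf budget
        adv = np.clip(first + noise, 0, 255).astype(np.uint8)
        iou = run_tracker([adv] + list(frames[1:]))     # one black-box query
        if iou < best_iou:                              # keep noise that hurts most
            best_iou, best_noise = iou, noise
    adv_first = np.clip(first + best_noise, 0, 255).astype(np.uint8)
    return adv_first, best_iou
```

Because only the first frame is modified and the tracker is scored per full clip, each candidate perturbation costs exactly one query; the per-patch search is what DIMBA replaces with a learned policy to cut unnecessary query overhead.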


