Show simple item record

dc.contributor.author: Harris, DJ
dc.contributor.author: Arthur, T
dc.contributor.author: Vine, SJ
dc.contributor.author: Liu, J
dc.contributor.author: Abd Rahman, HR
dc.contributor.author: Han, F
dc.contributor.author: Wilson, MR
dc.date.accessioned: 2023-01-06T09:49:18Z
dc.date.issued: 2022-12-21
dc.date.updated: 2023-01-06T09:30:34Z
dc.description.abstract: In this study, we examined the relationship between physiological encoding of surprise and the learning of anticipatory eye movements. Active inference portrays perception and action as interconnected inference processes, driven by the imperative to minimise the surprise of sensory observations. To examine this characterisation of oculomotor learning during a hand-eye coordination task, we tested whether anticipatory eye movements were updated in accordance with Bayesian principles and whether trial-by-trial learning rates tracked pupil dilation as a marker of 'surprise'. Forty-four participants completed an interception task in immersive virtual reality that required them to hit bouncing balls that had either expected or unexpected bounce profiles. We recorded anticipatory eye movements known to index participants' beliefs about likely ball bounce trajectories. By fitting a hierarchical Bayesian inference model to the trial-wise trajectories of these predictive eye movements, we were able to estimate each individual's expectations about bounce trajectories, rates of belief updating, and precision-weighted prediction errors. We found that the task-evoked pupil response tracked prediction errors and learning rates but not beliefs about ball bounciness or environmental volatility. These findings are partially consistent with active inference accounts and shed light on how encoding of surprise may shape the control of action.
dc.description.sponsorship: Leverhulme Trust
dc.identifier.citation: Vol. 12, article 22098
dc.identifier.doi: https://doi.org/10.1038/s41598-022-26544-w
dc.identifier.uri: http://hdl.handle.net/10871/132155
dc.identifier: ORCID: 0000-0003-3880-3856 (Harris, DJ)
dc.identifier: ORCID: 0000-0001-9329-1262 (Vine, SJ)
dc.identifier: ORCID: 0000-0001-8145-6971 (Wilson, MR)
dc.language.iso: en
dc.publisher: Nature Research
dc.relation.url: https://www.ncbi.nlm.nih.gov/pubmed/36543845
dc.relation.url: https://osf.io/z96q8/
dc.rights: © The Author(s) 2022. Open Access. This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
dc.subject: Humans
dc.subject: Pupil
dc.subject: Bayes Theorem
dc.subject: Learning
dc.subject: Eye Movements
dc.subject: Virtual Reality
dc.subject: Psychomotor Performance
dc.title: Task-evoked pupillary responses track precision-weighted prediction errors and learning rate during interceptive visuomotor actions.
dc.type: Article
dc.date.available: 2023-01-06T09:49:18Z
dc.identifier.issn: 2045-2322
exeter.article-number: 22098
exeter.place-of-publication: England
dc.description: This is the final version. Available on open access from Nature Research via the DOI in this record.
dc.description: Data availability: All relevant data and code are available online from: https://osf.io/z96q8/
dc.identifier.journal: Scientific Reports
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dcterms.dateAccepted: 2022-12-15
rioxxterms.version: VoR
rioxxterms.licenseref.startdate: 2022-12-21
rioxxterms.type: Journal Article/Review
refterms.dateFCD: 2023-01-06T09:43:39Z
refterms.versionFCD: VoR
refterms.dateFOA: 2023-01-06T09:50:16Z
refterms.panel: A
refterms.dateFirstOnline: 2022-12-21
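The abstract describes trial-wise belief updating in which a precision-weighted prediction error, scaled by a learning rate, revises expectations about ball bounce behaviour. As an illustrative sketch only, and not the authors' fitted hierarchical Bayesian model, the idea can be shown with a scalar Kalman-style update; the variable names, the simulated observations, and the observation-noise value here are all assumptions for demonstration.

```python
# Illustrative sketch of precision-weighted belief updating (NOT the
# paper's hierarchical model). The learning rate is a precision weight:
# the more uncertain the prior relative to expected sensory noise, the
# larger the update driven by a given prediction error.

def update_belief(mu, sigma2, obs, obs_noise):
    """One trial of precision-weighted belief updating.

    mu, sigma2 : current belief mean and variance (prior)
    obs        : observed bounce outcome this trial (hypothetical units)
    obs_noise  : assumed observation-noise variance
    """
    delta = obs - mu                       # prediction error
    kappa = sigma2 / (sigma2 + obs_noise)  # learning rate (precision weight)
    mu_new = mu + kappa * delta            # belief shifts toward observation
    sigma2_new = (1.0 - kappa) * sigma2    # posterior uncertainty shrinks
    return mu_new, sigma2_new, delta, kappa

# Simulated trials with an unexpected change in bounce profile mid-sequence
mu, sigma2 = 0.0, 1.0
for obs in [0.1, -0.05, 0.0, 0.9, 1.1, 1.0]:
    mu, sigma2, delta, kappa = update_belief(mu, sigma2, obs, obs_noise=0.5)
```

In this simplified scheme the learning rate falls as the belief becomes more precise, so the surprising mid-sequence change produces large prediction errors; the paper's pupillometry analysis relates exactly these trial-wise quantities (prediction error and learning rate) to the pupil response.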


