Show simple item record

dc.contributor.author: Harris, DJ
dc.contributor.author: Wilson, MR
dc.contributor.author: Jones, MI
dc.contributor.author: de Burgh, T
dc.contributor.author: Mundy, D
dc.contributor.author: Arthur, T
dc.contributor.author: Olonilua, M
dc.contributor.author: Vine, SL
dc.date.accessioned: 2023-03-09T13:38:26Z
dc.date.issued: 2023-06-12
dc.date.updated: 2023-03-09T12:26:44Z
dc.description.abstract: The control of eye gaze is critical to the execution of many skills. The observation that task experts in many domains exhibit more efficient control of eye gaze than novices has led to the development of gaze training interventions that teach these behaviours. We aimed to extend this literature by i) examining the relative benefits of feed-forward (observing an expert’s eye movements) versus feed-back (observing your own eye movements) training, and ii) automating this training within virtual reality. Serving personnel from the British Army and Royal Navy were randomised to either feed-forward or feed-back training within a virtual reality simulation of a room search and clearance task. Eye movement metrics – including visual search, saccade direction, and entropy – were recorded to quantify the efficiency of visual search behaviours. Feed-forward and feed-back eye movement training produced distinct learning benefits, but both accelerated the development of efficient gaze behaviours. However, we found no evidence that these more efficient search behaviours transferred to better decision making in the room clearance task. Our results suggest integrating eye movement training principles within virtual reality training simulations may be effective, but further work is needed to understand the learning mechanisms.
dc.description.sponsorship: Defence Science and Technology Laboratory (DSTL)
dc.identifier.citation: Vol. 15 (3), article 7
dc.identifier.doi: 10.16910/jemr.15.3.7
dc.identifier.grantnumber: HS1.010
dc.identifier.uri: http://hdl.handle.net/10871/132654
dc.identifier: ORCID: 0000-0003-3880-3856 (Harris, David)
dc.language.iso: en
dc.publisher: Bern Open Publishing
dc.relation.url: https://osf.io/qn2g4/
dc.rights: © 2023 David Harris, Mark Wilson, Martin Jones, Toby de Burgh, Daisy Mundy, Tom Arthur, Mayowa Olonilua, Samuel Vine. Open access. This work is licensed under a Creative Commons Attribution 4.0 International License.
dc.subject: eye tracking
dc.subject: VR
dc.subject: skill acquisition
dc.subject: military
dc.subject: eye movement
dc.subject: defence
dc.title: An investigation of feed-forward and feed-back eye movement training in immersive virtual reality
dc.type: Article
dc.date.available: 2023-03-09T13:38:26Z
dc.identifier.issn: 1995-8692
dc.description: This is the final version. Available on open access from Bern Open Publishing via the DOI in this record.
dc.description: Data availability: All relevant data and code are available online from: https://osf.io/qn2g4/
dc.identifier.journal: Journal of Eye Movement Research
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dcterms.dateAccepted: 2023-03-09
dcterms.dateSubmitted: 2022-12-13
rioxxterms.version: VoR
rioxxterms.licenseref.startdate: 2023-03-09
rioxxterms.type: Journal Article/Review
refterms.dateFCD: 2023-03-09T12:26:46Z
refterms.versionFCD: AM
refterms.dateFOA: 2023-08-15T13:24:58Z
refterms.panel: A

