Show simple item record

dc.contributor.author: Troscianko, J
dc.contributor.author: Osorio, D
dc.date.accessioned: 2023-10-13T10:09:10Z
dc.date.issued: 2023-06-15
dc.date.updated: 2023-10-13T08:54:41Z
dc.description.abstract: An object's colour, brightness and pattern are all influenced by its surroundings, and a number of visual phenomena and "illusions" have been discovered that highlight these often dramatic effects. Explanations for these phenomena range from low-level neural mechanisms to high-level processes that incorporate contextual information or prior knowledge. Importantly, few of these phenomena can currently be accounted for in quantitative models of colour appearance. Here we ask to what extent colour appearance is predicted by a model based on the principle of coding efficiency. The model assumes that the image is encoded by noisy spatio-chromatic filters at one octave separations, which are either circularly symmetrical or oriented. Each spatial band's lower threshold is set by the contrast sensitivity function, and the dynamic range of the band is a fixed multiple of this threshold, above which the response saturates. Filter outputs are then reweighted to give equal power in each channel for natural images. We demonstrate that the model fits human behavioural performance in psychophysics experiments, and also primate retinal ganglion responses. Next, we systematically test the model's ability to qualitatively predict over 50 brightness and colour phenomena, with almost complete success. This implies that much of colour appearance is potentially attributable to simple mechanisms evolved for efficient coding of natural images, and is a well-founded basis for modelling the vision of humans and other animals. [en_GB]
dc.description.sponsorship: Natural Environment Research Council (NERC) [en_GB]
dc.identifier.citation: Vol. 19, No. 6, article e1011117 [en_GB]
dc.identifier.doi: https://doi.org/10.1371/journal.pcbi.1011117
dc.identifier.grantnumber: NE/P018084/1 [en_GB]
dc.identifier.uri: http://hdl.handle.net/10871/134237
dc.identifier: ORCID: 0000-0001-9071-2594 (Troscianko, Jolyon)
dc.language.iso: en [en_GB]
dc.publisher: Public Library of Science (PLoS) [en_GB]
dc.relation.url: https://www.biorxiv.org/content/biorxiv/early/2022/02/23/2022.02.22.481414/DC1/ [en_GB]
dc.relation.url: https://www.ncbi.nlm.nih.gov/pubmed/37319266 [en_GB]
dc.rights: © 2023 Troscianko, Osorio. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. [en_GB]
dc.subject: Colour [en_GB]
dc.subject: Colour Perception [en_GB]
dc.subject: Retinal Ganglion Cells [en_GB]
dc.subject: Contrast Sensitivity [en_GB]
dc.subject: Psychophysics [en_GB]
dc.title: A model of colour appearance based on efficient coding of natural images [en_GB]
dc.type: Article [en_GB]
dc.date.available: 2023-10-13T10:09:10Z
dc.identifier.issn: 1553-734X
exeter.article-number: e1011117
exeter.place-of-publication: United States
dc.description: This is the final version. Available on open access from Public Library of Science via the DOI in this record. [en_GB]
dc.description: Data Availability Statement: An implementation of the SBL model is provided here: https://www.biorxiv.org/content/biorxiv/early/2022/02/23/2022.02.22.481414/DC1/ [en_GB]
dc.identifier.eissn: 1553-7358
dc.identifier.journal: PLoS Computational Biology [en_GB]
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/ [en_GB]
dcterms.dateAccepted: 2023-04-20
rioxxterms.version: VoR [en_GB]
rioxxterms.licenseref.startdate: 2023-06-15
rioxxterms.type: Journal Article/Review [en_GB]
refterms.dateFCD: 2023-10-13T10:06:49Z
refterms.versionFCD: VoR
refterms.dateFOA: 2023-10-13T10:09:11Z
refterms.panel: A [en_GB]
refterms.dateFirstOnline: 2023-06-15


