dc.contributor.author | De Ath, G | |
dc.contributor.author | Everson, RM | |
dc.contributor.author | Fieldsend, JE | |
dc.date.accessioned | 2021-05-04T08:26:23Z | |
dc.date.issued | 2021-07-07 | |
dc.description.abstract | Bayesian optimisation (BO) uses probabilistic surrogate models - usually Gaussian processes (GPs) - for the optimisation of expensive black-box functions. At each BO iteration, the GP hyperparameters are fit to previously evaluated data by maximising the marginal likelihood. However, this fails to account for uncertainty in the hyperparameters themselves, leading to overconfident model predictions. This uncertainty can be accounted for by taking the Bayesian approach of marginalising out the model hyperparameters. We investigate whether a fully Bayesian treatment of the Gaussian process hyperparameters in BO (FBBO) leads to improved optimisation performance. Since an analytic approach is intractable, we compare FBBO using three approximate inference schemes to the maximum likelihood approach, using the Expected Improvement (EI) and Upper Confidence Bound (UCB) acquisition functions paired with ARD and isotropic Matérn kernels, across 15 well-known benchmark problems for 4 observational noise settings. FBBO using EI with an ARD kernel leads to the best performance in the noise-free setting, with much less difference between combinations of BO components when the noise is increased. FBBO leads to over-exploration with UCB, but is not detrimental with EI. Therefore, we recommend FBBO using EI with an ARD kernel as the default choice for BO. | en_GB |
dc.description.sponsorship | Innovate UK | en_GB |
dc.identifier.citation | GECCO '21: Proceedings of the 2021 Genetic and Evolutionary Computation Conference, 10-14 July 2021, Lille, France, pp. 1860-1869 | en_GB |
dc.identifier.doi | 10.1145/3449726.3463164 | |
dc.identifier.grantnumber | 104400 | en_GB |
dc.identifier.grantnumber | 105874 | en_GB |
dc.identifier.uri | http://hdl.handle.net/10871/125532 | |
dc.language.iso | en | en_GB |
dc.publisher | Association for Computing Machinery (ACM) | en_GB |
dc.rights | © 2021. Copyright held by the owner/author(s). Publication rights licensed to ACM. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. | en_GB |
dc.subject | Bayesian optimisation | en_GB |
dc.subject | Surrogate modelling | en_GB |
dc.subject | Gaussian process | en_GB |
dc.subject | Approximate inference | en_GB |
dc.title | How Bayesian Should Bayesian Optimisation Be? | en_GB |
dc.type | Conference paper | en_GB |
dc.date.available | 2021-05-04T08:26:23Z | |
dc.identifier.isbn | 978-1-4503-8351-6 | |
dc.description | This is the author accepted manuscript. The final version is available from ACM via the DOI in this record | en_GB |
dc.rights.uri | http://www.rioxx.net/licenses/all-rights-reserved | en_GB |
pubs.funder-ackownledgement | Yes | en_GB |
dcterms.dateAccepted | 2021-04-26 | |
exeter.funder | ::Innovate UK | en_GB |
exeter.funder | ::Innovate UK | en_GB |
rioxxterms.version | AM | en_GB |
rioxxterms.licenseref.startdate | 2021-04-26 | |
rioxxterms.type | Conference Paper/Proceeding/Abstract | en_GB |
refterms.dateFCD | 2021-05-03T14:19:59Z | |
refterms.versionFCD | AM | |
refterms.dateFOA | 2021-07-12T13:27:39Z | |
refterms.panel | B | en_GB |