Show simple item record

dc.contributor.author: Chugh, T
dc.contributor.author: Rahat, A
dc.contributor.author: Palar, PS
dc.date.accessioned: 2019-06-26T08:04:59Z
dc.date.issued: 2020-01-03
dc.description.abstract: Gaussian processes (GPs) belong to a class of probabilistic techniques that have been used successfully across many domains of machine learning and optimization. They are popular because they provide uncertainties in their predictions, which sets them apart from modelling methods that provide only point predictions; this uncertainty is particularly useful for decision making, as it indicates how reliable a prediction is. One of the fundamental challenges in using GPs is that the efficacy of a model depends on selecting an appropriate kernel and the associated hyperparameter values for a given problem. Furthermore, training a GP, that is, optimizing its hyperparameters on a data set, is traditionally performed with a cost function that is a weighted sum of data fit and model complexity, so the underlying trade-off between the two is ignored. To address these challenges and shortcomings, we propose the following automated training scheme in this article. Firstly, we use a weighted product of multiple kernels, relieving users of having to choose an appropriate kernel for the problem at hand without domain-specific knowledge. Secondly, for the first time, we modify GP training by using a multi-objective optimizer to tune the hyperparameters and the weights of the multiple kernels, extracting an approximation of the complete trade-off front between data fit and model complexity. We then propose a novel solution selection strategy based on the mean standardized log loss (MSLL) to select a solution from the estimated trade-off front and finalise the training of a GP model. Results on three data sets and a comparison with the standard approach clearly show the potential benefit of the proposed approach of using multi-objective optimization with multiple kernels.
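The two objectives described in the abstract are the terms of the GP log marginal likelihood. The following minimal sketch (not the authors' code; the weighted-product form, kernel choices, and all function names are illustrative assumptions) shows how a weighted product of two kernels can be evaluated and how the data-fit and complexity terms can be computed separately, as a multi-objective optimizer would require:

```python
# Illustrative sketch only: splits the GP log marginal likelihood into the
# data-fit and model-complexity terms treated as separate objectives.
import numpy as np

def rbf(X1, X2, ls):
    # Squared-exponential (RBF) kernel.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)

def exp_kernel(X1, X2, ls):
    # Exponential (Matern-1/2) kernel.
    d = np.sqrt(((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1))
    return np.exp(-d / ls)

def product_kernel(X1, X2, ls, w):
    # Hypothetical weighted product of kernels: k = k1^w * k2^(1-w).
    # Both factors remain valid kernels here, so the product is PSD.
    return rbf(X1, X2, ls) ** w * exp_kernel(X1, X2, ls) ** (1 - w)

def gp_objectives(X, y, ls, w, noise=1e-2):
    # Returns the two competing objectives (both to be minimised):
    #   data_fit   = 0.5 * y^T K^-1 y
    #   complexity = 0.5 * log|K|  (via the Cholesky factor's diagonal)
    n = len(y)
    K = product_kernel(X, X, ls, w) + noise * np.eye(n)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    data_fit = 0.5 * y @ alpha
    complexity = np.sum(np.log(np.diag(L)))
    return data_fit, complexity

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(30, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(30)
fit, cmplx = gp_objectives(X, y, ls=1.0, w=0.5)
```

A multi-objective optimizer would vary `ls`, `w`, and `noise` to approximate the trade-off front between the two returned values, rather than minimising their weighted sum as in standard GP training.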
dc.description.sponsorship: Natural Environment Research Council (NERC)
dc.identifier.citation: Vol. 11943, pp. 579-591
dc.identifier.doi: 10.1007/978-3-030-37599-7_48
dc.identifier.grantnumber: NE/P017436/1
dc.identifier.uri: http://hdl.handle.net/10871/37682
dc.language.iso: en
dc.publisher: Springer Verlag
dc.rights: © Springer Nature Switzerland AG 2019
dc.subject: Kriging
dc.subject: Bayesian optimisation
dc.subject: multi-objective optimisation
dc.subject: model selection
dc.title: Trading-off Data Fit and Complexity in Training Gaussian Processes with Multiple Kernels
dc.type: Conference paper
dc.date.available: 2019-06-26T08:04:59Z
dc.contributor.editor: Nicosia, G
dc.contributor.editor: Pardalos, P
dc.contributor.editor: Giuffrida, G
dc.contributor.editor: Umeton, R
dc.contributor.editor: Sciacca, V
dc.identifier.issn: 0302-9743
dc.description: This is the author accepted manuscript. The final version is available from Springer Verlag via the DOI in this record.
dc.description: LOD 2019: Fifth International Conference on Machine Learning, Optimization, and Data Science, 10-13 September 2019, Siena, Italy
dc.identifier.journal: Lecture Notes in Computer Science
dc.rights.uri: http://www.rioxx.net/licenses/all-rights-reserved
dcterms.dateAccepted: 2019-06-11
exeter.funder: Natural Environment Research Council (NERC)
rioxxterms.version: AM
rioxxterms.licenseref.startdate: 2019-06-11
rioxxterms.type: Conference Paper/Proceeding/Abstract
refterms.dateFCD: 2019-06-25T12:36:52Z
refterms.versionFCD: AM
refterms.dateFOA: 2020-02-10T13:51:50Z
refterms.panel: B

