Show simple item record

dc.contributor.author: Mazumdar, A
dc.contributor.author: López-Ibáñez, M
dc.contributor.author: Chugh, T
dc.contributor.author: Hakanen, J
dc.contributor.author: Miettinen, K
dc.date.accessioned: 2023-05-04T14:41:34Z
dc.date.issued: 2023-04-28
dc.date.updated: 2023-05-04T14:18:34Z
dc.description.abstract: For offline data-driven multiobjective optimization problems (MOPs), no new data is available during the optimization process. Approximation models (or surrogates) are first built using the provided offline data, and an optimizer, e.g., a multiobjective evolutionary algorithm, can then be utilized to find Pareto optimal solutions to the problem with surrogates as objective functions. In contrast to online data-driven MOPs, these surrogates cannot be updated with new data and, hence, the approximation accuracy cannot be improved by considering new data during the optimization process. Gaussian process regression (GPR) models are widely used as surrogates because of their ability to provide uncertainty information. However, building GPRs becomes computationally expensive when the size of the dataset is large. Using sparse GPRs reduces the computational cost of building the surrogates. However, sparse GPRs are not tailored to solve offline data-driven MOPs, where good accuracy of the surrogates is needed near Pareto optimal solutions. Treed GPR (TGPR-MO) surrogates for offline data-driven MOPs with continuous decision variables are proposed in this paper. The proposed surrogates first split the decision space into subregions using regression trees and build GPRs sequentially in regions close to Pareto optimal solutions in the decision space to accurately approximate tradeoffs between the objective functions. TGPR-MO surrogates are computationally inexpensive because GPRs are built only in a smaller region of the decision space utilizing a subset of the data. The TGPR-MO surrogates were tested on distance-based visualizable problems with various data sizes, sampling strategies, and numbers of objective functions and decision variables. Experimental results showed that the TGPR-MO surrogates are computationally cheaper and can handle large datasets. Furthermore, TGPR-MO surrogates produced solutions closer to Pareto optimal solutions compared to full GPRs and sparse GPRs.
dc.description.sponsorship: Academy of Finland
dc.format.extent: 1-24
dc.identifier.citation: Published online 28 April 2023
dc.identifier.doi: https://doi.org/10.1162/evco_a_00329
dc.identifier.grantnumber: 311877
dc.identifier.grantnumber: 322221
dc.identifier.uri: http://hdl.handle.net/10871/133083
dc.identifier: ORCID: 0000-0001-5123-8148 (Chugh, Tinkle)
dc.language.iso: en
dc.publisher: MIT Press
dc.rights.embargoreason: Under embargo until 28 July 2023 in compliance with publisher policy
dc.rights: © 2023 Massachusetts Institute of Technology
dc.subject: Gaussian processes
dc.subject: Kriging
dc.subject: Regression trees
dc.subject: Metamodelling
dc.subject: Surrogate
dc.subject: Pareto optimality
dc.title: Treed Gaussian Process Regression for Solving Offline Data-Driven Continuous Multiobjective Optimization Problems
dc.type: Article
dc.date.available: 2023-05-04T14:41:34Z
dc.description: This is the final version. Available from MIT Press via the DOI in this record
dc.identifier.eissn: 1530-9304
dc.identifier.journal: Evolutionary Computation
dc.relation.ispartof: Evolutionary Computation
dc.rights.uri: http://www.rioxx.net/licenses/all-rights-reserved
rioxxterms.version: VoR
rioxxterms.licenseref.startdate: 2023-04-28
rioxxterms.type: Journal Article/Review
refterms.dateFCD: 2023-05-04T14:37:21Z
refterms.versionFCD: VoR
refterms.dateFOA: 2023-07-27T23:00:00Z
refterms.panel: B
refterms.dateFirstOnline: 2023-04-28
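The abstract describes the core idea of treed GPR surrogates: partition the decision space with a regression tree, then fit a separate GPR only in the leaves that matter. Below is a minimal illustrative sketch of that idea, not the authors' TGPR-MO implementation: a single median split of a toy one-dimensional decision space stands in for the regression tree, an exact RBF-kernel GPR is fitted per leaf, and queries are routed to the leaf they fall in. All names, hyperparameters, and the toy objective are assumptions for illustration.

```python
import numpy as np

def rbf_kernel(A, B, length=0.15):
    # Squared-exponential kernel between the rows of A (n, d) and B (m, d).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length**2)

class RegionGPR:
    """Exact GPR with a fixed RBF kernel, fitted to one subregion's data."""
    def __init__(self, X, y, noise=1e-4):
        self.X = X
        K = rbf_kernel(X, X) + noise * np.eye(len(X))
        self.L = np.linalg.cholesky(K)
        self.alpha = np.linalg.solve(self.L.T, np.linalg.solve(self.L, y))

    def predict(self, Xs):
        Ks = rbf_kernel(Xs, self.X)
        mean = Ks @ self.alpha
        v = np.linalg.solve(self.L, Ks.T)
        var = rbf_kernel(Xs, Xs).diagonal() - (v**2).sum(0)
        return mean, var

# Toy 1-D objective and offline data (hypothetical stand-in for a real MOP
# objective; no new data is assumed to arrive during optimization).
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(80, 1))
y = np.sin(6 * X[:, 0]) + 0.01 * rng.standard_normal(80)

# "Tree" step: one median split into two leaves. The paper's method grows a
# full regression tree and fits GPRs only in leaves near Pareto optimal
# solutions, so each GPR sees just a subset of the data.
split = np.median(X[:, 0])
in_right = X[:, 0] > split
models = {False: RegionGPR(X[~in_right], y[~in_right]),
          True: RegionGPR(X[in_right], y[in_right])}

def predict(Xs):
    # Route each query point to the GPR of the leaf it falls in.
    mean, var = np.empty(len(Xs)), np.empty(len(Xs))
    right = Xs[:, 0] > split
    for side, mask in ((False, ~right), (True, right)):
        if mask.any():
            mean[mask], var[mask] = models[side].predict(Xs[mask])
    return mean, var

Xq = np.linspace(0, 1, 50)[:, None]
mu, s2 = predict(Xq)
```

Because each leaf's GPR is built from only its own subset of points, the cubic cost of the Cholesky factorization applies per leaf rather than to the full dataset, which is the computational saving the abstract refers to.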

