Show simple item record

dc.contributor.author: Mohammadi, H
dc.contributor.author: Le Riche, R
dc.contributor.author: Durrande, N
dc.contributor.author: Touboul, E
dc.contributor.author: Bay, X
dc.date.accessioned: 2017-03-23T14:54:53Z
dc.date.issued: 2016-06-08
dc.description.abstract: Gaussian Processes (GPs) are often used to predict the output of a parameterized deterministic experiment. They have many applications in the field of Computer Experiments, in particular to perform sensitivity analysis, adaptive design of experiments and global optimization. Nearly all of the applications of GPs to Computer Experiments require the inversion of a covariance matrix. Because this matrix is often ill-conditioned, regularization techniques are required. Today, there is still a need to better regularize GPs. The two most classical regularization methods to avoid degeneracy of the covariance matrix are i) pseudoinverse (PI) and ii) adding a small positive constant to the main diagonal, i.e., the case of noisy observations. Herein, we will refer to the second regularization technique, with a slight abuse of language, as nugget. This paper provides algebraic calculations which allow comparing PI and nugget regularizations. It is proven that pseudoinverse regularization averages the output values and makes the variance null at redundant points. In contrast, nugget regularization lacks interpolation properties but preserves a non-zero variance at every point. However, these two regularization techniques become similar as the nugget value decreases. A distribution-wise GP is introduced which interpolates Gaussian distributions instead of data points and mitigates the drawbacks of pseudoinverse and nugget regularized GPs. Finally, data-model discrepancy is discussed and serves as a guide for choosing a regularization technique. [en_GB]
dc.description.sponsorship: The authors would like to acknowledge support by the French national research agency (ANR) within the Modèles Numérique project “NumBBO - Analysis, Improvement and Evaluation of Numerical Blackbox Optimizers”. [en_GB]
dc.identifier.uri: http://hdl.handle.net/10871/26760
dc.language.iso: en [en_GB]
dc.publisher: Ecole Nationale Supérieure des Mines de Saint-Etienne [en_GB]
dc.subject: Gaussian process [en_GB]
dc.subject: nugget [en_GB]
dc.subject: Pseudo-inverse [en_GB]
dc.subject: covariance matrix [en_GB]
dc.subject: Degeneracy of covariance matrices [en_GB]
dc.subject: Gaussian process regression [en_GB]
dc.subject: Kriging [en_GB]
dc.subject: Regularization [en_GB]
dc.title: An analytic comparison of regularization methods for Gaussian processes [en_GB]
dc.type: Report [en_GB]
dc.date.available: 2017-03-23T14:54:53Z
exeter.confidential: false [en_GB]
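
The abstract above contrasts pseudoinverse (PI) and nugget regularization of an ill-conditioned GP covariance matrix. The short Python sketch below illustrates that contrast on a toy design with a redundant point; it is not the authors' code, and the squared-exponential kernel, the design points, the outputs and the nugget value tau^2 = 1e-6 are illustrative assumptions.

# Minimal sketch (illustrative, not from the report): PI vs. nugget
# regularization of a GP covariance matrix with one redundant design point.
import numpy as np

def sq_exp_kernel(A, B, length_scale=1.0):
    """Squared-exponential covariance between two sets of 1-D points."""
    d = A[:, None] - B[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

# Design with a redundant point (x = 0.5 appears twice) -> singular covariance.
X = np.array([0.0, 0.5, 0.5, 1.0])
y = np.array([0.0, 1.0, 2.0, 0.5])         # conflicting outputs at the repeated point
K = sq_exp_kernel(X, X)

x_new = np.array([0.5])                    # predict at the redundant location
k_new = sq_exp_kernel(x_new, X)            # cross-covariances, shape (1, 4)
k_nn = sq_exp_kernel(x_new, x_new)         # prior variance at x_new

# 1) Pseudoinverse regularization: K^+ replaces K^{-1}.
K_pi = np.linalg.pinv(K)
mean_pi = k_new @ K_pi @ y                 # averages the conflicting outputs at x = 0.5
var_pi = k_nn - k_new @ K_pi @ k_new.T     # collapses to (numerically) zero, as stated in the abstract

# 2) Nugget regularization: add a small constant tau^2 to the main diagonal.
tau2 = 1e-6
K_nug = K + tau2 * np.eye(len(X))
K_nug_inv = np.linalg.inv(K_nug)
mean_nug = k_new @ K_nug_inv @ y           # close to the PI mean for small tau^2
var_nug = k_nn - k_new @ K_nug_inv @ k_new.T  # stays strictly positive

print("PI     : mean %.4f, variance %.2e" % (mean_pi[0], var_pi[0, 0]))
print("Nugget : mean %.4f, variance %.2e" % (mean_nug[0], var_nug[0, 0]))

Running the sketch shows the behaviour described in the abstract: PI averages the redundant outputs and drives the predictive variance to zero at that point, whereas the nugget keeps a small but non-zero variance there, and the two predictions converge as tau^2 shrinks.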

