Show simple item record

dc.contributor.author: Ying, Yiming
dc.contributor.author: Zhou, Ding-Xuan
dc.date.accessioned: 2013-07-22T15:33:48Z
dc.date.issued: 2007
dc.description.abstract: Gaussian kernels with flexible variances provide a rich family of Mercer kernels for learning algorithms. We show that the union of the unit balls of reproducing kernel Hilbert spaces generated by Gaussian kernels with flexible variances is a uniform Glivenko-Cantelli (uGC) class. This result confirms a conjecture concerning learnability of Gaussian kernels and verifies the uniform convergence of many learning algorithms involving Gaussians with changing variances. Rademacher averages and empirical covering numbers are used to estimate sample errors of multi-kernel regularization schemes associated with general loss functions. It is then shown that the regularization error associated with the least square loss and the Gaussian kernels can be greatly improved when flexible variances are allowed. Finally, for regularization schemes generated by Gaussian kernels with flexible variances we present explicit learning rates for regression with least square loss and classification with hinge loss. [en_GB]
dc.identifier.citation: Vol. 8, pp. 249-276 [en_GB]
dc.identifier.uri: http://hdl.handle.net/10871/11961
dc.language.iso: en [en_GB]
dc.publisher: Microtome Publishing [en_GB]
dc.relation.url: https://jmlr.org/papers/v8/ying07a.html [en_GB]
dc.subject: Gaussian kernel [en_GB]
dc.subject: flexible variances [en_GB]
dc.subject: learning theory [en_GB]
dc.subject: Glivenko-Cantelli class [en_GB]
dc.subject: regularization scheme [en_GB]
dc.subject: empirical covering number [en_GB]
dc.title: Learnability of Gaussians with flexible variances [en_GB]
dc.type: Article [en_GB]
dc.date.available: 2013-07-22T15:33:48Z
dc.identifier.issn: 1532-4435
dc.description: Copyright © 2007 Yiming Ying and Ding-Xuan Zhou [en_GB]
dc.identifier.eissn: 1533-7928
dc.identifier.journal: Journal of Machine Learning Research [en_GB]
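The abstract above concerns the family of Gaussian kernels K_sigma(x, y) = exp(-||x - y||^2 / sigma^2) in which the variance is allowed to vary ("flexible variances"). As a minimal illustrative sketch (not taken from the paper; data, function names, and the chosen variances are assumptions), the corresponding multi-kernel family of Gram matrices can be computed as follows:

```python
import numpy as np

def gaussian_kernel(X, Y, sigma):
    """Gram matrix of the Gaussian kernel K_sigma(x, y) = exp(-||x - y||^2 / sigma^2)."""
    # Pairwise squared Euclidean distances via the expansion ||x||^2 + ||y||^2 - 2<x, y>.
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-np.maximum(sq, 0.0) / sigma**2)

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 2))  # toy sample of 5 points in the plane

# A "flexible variance" (multi-kernel) view: the same data at several widths.
grams = {sigma: gaussian_kernel(X, X, sigma) for sigma in (0.5, 1.0, 2.0)}

for sigma, K in grams.items():
    # Each Gram matrix is symmetric with ones on the diagonal.
    assert np.allclose(K, K.T)
    assert np.allclose(np.diag(K), 1.0)
```

The multi-kernel regularization schemes discussed in the abstract optimize over such a family of kernels (i.e. over sigma) rather than fixing a single width in advance.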

