dc.contributor.author: Ying, Yiming
dc.contributor.author: Campbell, Colin
dc.date.accessioned: 2013-07-22T15:56:27Z
dc.date.issued: 2010-11-01
dc.description.abstract [en_GB]: We develop a novel generalization bound for the kernel learning problem. First, we show that the generalization analysis of the kernel learning problem reduces to investigation of the suprema of the Rademacher chaos process of order 2 over candidate kernels, which we refer to as Rademacher chaos complexity. Next, we show how to estimate the empirical Rademacher chaos complexity by well-established metric entropy integrals and the pseudo-dimension of the set of candidate kernels. Our new methodology mainly depends on the principal theory of U-processes and entropy integrals. Finally, we establish satisfactory excess generalization bounds and misclassification error rates for learning Gaussian kernels and general radial basis kernels.
dc.identifier.citation [en_GB]: Vol. 22 (11), pp. 2858-2886
dc.identifier.doi: 10.1162/NECO_a_00028
dc.identifier.uri: http://hdl.handle.net/10871/11962
dc.language.iso [en_GB]: en
dc.publisher [en_GB]: MIT Press
dc.subject [en_GB]: Algorithms
dc.subject [en_GB]: Learning
dc.subject [en_GB]: Neural Networks (Computer)
dc.title [en_GB]: Rademacher chaos complexities for learning the kernel problem
dc.type [en_GB]: Article
dc.date.available: 2013-07-22T15:56:27Z
dc.identifier.issn: 0899-7667
exeter.place-of-publication: United States
dc.description [en_GB]: Copyright © 2010 The MIT Press
dc.identifier.eissn: 1530-888X
dc.identifier.journal [en_GB]: Neural Computation
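
For context on the central quantity named in the abstract, the sketch below gives a standard formulation of the empirical Rademacher chaos complexity of order 2 over a class of candidate kernels \(\mathcal{K}\); this is an illustrative definition, and the normalization used in the paper itself may differ.

\[
\hat{U}_n(\mathcal{K}) \;=\; \mathbb{E}_{\varepsilon}\,\sup_{k \in \mathcal{K}} \Bigl|\, \tfrac{1}{n} \sum_{1 \le i < j \le n} \varepsilon_i \varepsilon_j\, k(x_i, x_j) \Bigr|,
\]

where \(x_1, \dots, x_n\) are the training inputs and \(\varepsilon_1, \dots, \varepsilon_n\) are independent Rademacher variables taking values in \(\{-1, +1\}\) with equal probability. The abstract states that the generalization analysis of kernel learning reduces to controlling such suprema over the candidate kernel class.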

