University of Exeter
Generalization bounds for learning the kernel

conference contribution
posted on 2025-07-30, 21:44 authored by Yiming Ying, Colin Campbell
In this paper we develop a novel probabilistic generalization bound for the kernel learning problem. First, we show that the generalization analysis of kernel learning algorithms reduces to investigating the suprema of the Rademacher chaos process of order two over the candidate kernels, which we refer to as the Rademacher chaos complexity. Next, we show how to estimate the empirical Rademacher chaos complexity using well-established metric entropy integrals and the pseudo-dimension of the set of candidate kernels. Our new methodology rests mainly on the theory of U-processes. Finally, we establish satisfactory excess generalization bounds and misclassification error rates for learning Gaussian kernels and general radial basis kernels.
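The central quantity in the abstract, the order-two Rademacher chaos complexity, can be illustrated with a small Monte Carlo sketch: draw Rademacher signs, evaluate the quadratic chaos sum for each candidate kernel, and take the supremum over the candidate family. The Gaussian-width grid, the 1/n normalization, and all function names below are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def gaussian_gram(X, sigma):
    # Gaussian (RBF) Gram matrix K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2))
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def empirical_chaos_complexity(X, sigmas, n_trials=200, seed=0):
    """Monte Carlo estimate of the empirical order-two Rademacher chaos
    complexity over a grid of Gaussian kernel widths.  The 1/n scaling is
    an illustrative choice; the paper's normalization may differ."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    grams = [gaussian_gram(X, s) for s in sigmas]  # candidate kernels
    total = 0.0
    for _ in range(n_trials):
        eps = rng.choice([-1.0, 1.0], size=n)  # Rademacher variables
        best = 0.0
        for K in grams:
            # off-diagonal chaos sum: sum_{i != j} eps_i eps_j K_ij
            # (eps_i^2 = 1, so subtracting trace(K) removes the diagonal)
            val = eps @ K @ eps - np.trace(K)
            best = max(best, abs(val) / n)  # supremum over candidates
        total += best
    return total / n_trials  # average over Rademacher draws
```

As the abstract indicates, bounding the expectation of this supremum (rather than simulating it) is what requires metric entropy integrals, pseudo-dimension arguments, and the theory of U-processes.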

History

Language

en

Citation

22nd Annual Conference on Learning Theory (COLT 2009), Montreal, Canada, 18-21 June 2009

Department

  • Computer Science
