
Empirical Effective Dimension and Optimal Rates for Regularized Least Squares Algorithm

dc.date.accessioned: 2005-12-22T02:29:53Z
dc.date.accessioned: 2018-11-24T10:24:30Z
dc.date.available: 2005-12-22T02:29:53Z
dc.date.available: 2018-11-24T10:24:30Z
dc.date.issued: 2005-05-27
dc.identifier.uri: http://hdl.handle.net/1721.1/30548
dc.identifier.uri: http://repository.aust.edu.ng/xmlui/handle/1721.1/30548
dc.description.abstract: This paper presents an approach to model selection for regularized least squares on reproducing kernel Hilbert spaces in the semi-supervised setting. The effective dimension was recently shown to play a crucial role in defining a rule for choosing the regularization parameter that attains asymptotically optimal performance in the minimax sense. The main goal of the present paper is to show how the effective dimension can be replaced by an empirical counterpart while preserving optimality. The empirical effective dimension can be computed from independent unlabelled samples, which makes the approach particularly appealing in the semi-supervised setting.
dc.format.extent: 14 p.
dc.format.extent: 11158573 bytes
dc.format.extent: 526018 bytes
dc.language.iso: en_US
dc.subject: AI
dc.subject: optimal rates
dc.subject: effective dimension
dc.subject: semi-supervised learning
dc.title: Empirical Effective Dimension and Optimal Rates for Regularized Least Squares Algorithm
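The abstract above describes replacing the effective dimension with an empirical counterpart computed from unlabelled samples and using it to choose the regularization parameter. The following is a minimal sketch, assuming the commonly used definition N_hat(lambda) = tr[K_m (K_m + lambda I)^{-1}] with K_m the normalized Gram matrix of the unlabelled sample; the kernel choice, the normalization, and the balancing heuristic for selecting lambda are illustrative assumptions, not the paper's exact rule.

import numpy as np

def gaussian_kernel_matrix(X, sigma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel on an m x d sample X.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def empirical_effective_dimension(K, lam):
    # Empirical effective dimension N_hat(lam) = tr[K_m (K_m + lam I)^{-1}],
    # with K_m = K / m (one common normalization; the paper's may differ).
    m = K.shape[0]
    eigvals = np.linalg.eigvalsh(K / m)      # eigenvalues of the normalized Gram matrix
    eigvals = np.clip(eigvals, 0.0, None)    # guard against small negative round-off
    return float(np.sum(eigvals / (eigvals + lam)))

# Illustrative usage: compute N_hat(lam) on an unlabelled sample and pick lambda
# by a simple, hypothetical balancing heuristic N_hat(lam) ~ n * lam, where n is
# the number of labelled points (a stand-in for the paper's parameter choice rule).
rng = np.random.default_rng(0)
X_unlabelled = rng.standard_normal((500, 3))
K = gaussian_kernel_matrix(X_unlabelled)

n_labelled = 100
grid = np.logspace(-6, 0, 50)
scores = [abs(empirical_effective_dimension(K, lam) - n_labelled * lam) for lam in grid]
lam_star = grid[int(np.argmin(scores))]
print(f"selected lambda: {lam_star:.2e}")

Because N_hat(lambda) depends only on the kernel matrix of unlabelled points, this selection step uses no labels, which is what makes the rule attractive in the semi-supervised setting described in the abstract.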


Files in this item

File                         Size      Format
MIT-CSAIL-TR-2005-036.pdf    526.0Kb   application/pdf
MIT-CSAIL-TR-2005-036.ps     11.15Mb   application/postscript
