
Fast Rates for Regularized Least-squares Algorithm

dc.date.accessioned: 2005-12-22T02:28:27Z
dc.date.accessioned: 2018-11-24T10:24:28Z
dc.date.available: 2005-12-22T02:28:27Z
dc.date.available: 2018-11-24T10:24:28Z
dc.date.issued: 2005-04-14
dc.identifier.uri: http://hdl.handle.net/1721.1/30539
dc.identifier.uri: http://repository.aust.edu.ng/xmlui/handle/1721.1/30539
dc.description.abstract: We develop a theoretical analysis of the generalization performance of the regularized least-squares algorithm on reproducing kernel Hilbert spaces for supervised learning. We show that the effective dimension of an integral operator plays a central role in defining a criterion for choosing the regularization parameter as a function of the number of samples. A minimax analysis is then performed, showing that this criterion is asymptotically optimal.
dc.format.extent: 25 p.
dc.format.extent: 16130108 bytes
dc.format.extent: 833989 bytes
dc.language.iso: en_US
dc.subject: AI
dc.subject: optimal rates
dc.subject: regularized least-squares
dc.subject: reproducing kernel Hilbert space
dc.subject: effective dimension
dc.title: Fast Rates for Regularized Least-squares Algorithm
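
The abstract describes the regularized least-squares (kernel ridge regression) estimator and the role of the effective dimension of the kernel integral operator in choosing the regularization parameter. The following is a minimal numerical sketch, not code from the report: it fits the estimator on toy data with an assumed Gaussian kernel and evaluates the empirical analogue of the effective dimension N(lambda) = Tr[T(T + lambda I)^{-1}], with T approximated by K/n. The data, kernel width, and sample-size-dependent choice of lambda are illustrative assumptions, not the report's prescription.

    import numpy as np

    def rbf_kernel(X, Y, gamma=1.0):
        # Gaussian kernel matrix K[i, j] = exp(-gamma * ||x_i - y_j||^2)  (assumed kernel choice)
        sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
        return np.exp(-gamma * sq)

    def rls_fit(X, y, lam, gamma=1.0):
        # Regularized least-squares in the RKHS: coefficients c solve (K + n*lam*I) c = y,
        # giving the estimator f(x) = sum_i c_i k(x, x_i).
        n = len(y)
        K = rbf_kernel(X, X, gamma)
        c = np.linalg.solve(K + n * lam * np.eye(n), y)
        return c, K

    def effective_dimension(K, lam):
        # Empirical analogue of N(lam) = Tr[T (T + lam I)^{-1}], with the integral
        # operator T approximated by K / n.
        n = K.shape[0]
        T = K / n
        return np.trace(T @ np.linalg.inv(T + lam * np.eye(n)))

    # Toy usage: noisy samples of a smooth target.
    rng = np.random.default_rng(0)
    n = 200
    X = rng.uniform(-1.0, 1.0, size=(n, 1))
    y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(n)

    lam = n ** -0.5  # illustrative sample-size-dependent choice, not the report's criterion
    c, K = rls_fit(X, y, lam)
    print("effective dimension N(lam) ~", effective_dimension(K, lam))

In the report's analysis, the effective dimension quantifies how many directions of the kernel operator are significant at scale lambda, which is what ties the choice of lambda to the number of samples.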


Files in this item

File                        Size       Format
MIT-CSAIL-TR-2005-027.pdf   833.9 KB   application/pdf
MIT-CSAIL-TR-2005-027.ps    16.13 MB   application/postscript

