Show simple item record

Risk Bounds for Regularized Least-Squares Algorithm with Operator-Valued Kernels

dc.date.accessioned: 2005-12-22T02:28:54Z
dc.date.accessioned: 2018-11-24T10:24:28Z
dc.date.available: 2005-12-22T02:28:54Z
dc.date.available: 2018-11-24T10:24:28Z
dc.date.issued: 2005-05-16
dc.identifier.uri: http://hdl.handle.net/1721.1/30543
dc.identifier.uri: http://repository.aust.edu.ng/xmlui/handle/1721.1/30543
dc.description.abstract: We show that recent results in [3] on risk bounds for regularized least-squares on reproducing kernel Hilbert spaces can be straightforwardly extended to the vector-valued regression setting. We first briefly introduce central concepts on operator-valued kernels. Then we show how risk bounds can be expressed in terms of a generalization of effective dimension.
dc.format.extent: 17 p.
dc.format.extent: 12090406 bytes
dc.format.extent: 642646 bytes
dc.language.iso: en_US
dc.subject: AI
dc.subject: optimal rates
dc.subject: reproducing kernel Hilbert space
dc.subject: effective dimension
dc.title: Risk Bounds for Regularized Least-Squares Algorithm with Operator-Valued Kernels
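The abstract's setting, regularized least-squares regression with an operator-valued kernel, can be illustrated concretely. Below is a minimal sketch (not the report's code) of vector-valued kernel ridge regression using a separable operator-valued kernel K(x, x') = k(x, x')·B, where k is a scalar kernel and B couples the outputs; the Gaussian choice of k, the matrix B, and all function names are assumptions made for this example.

```python
import numpy as np

def scalar_kernel(X1, X2, sigma=1.0):
    # Gaussian kernel matrix between the rows of X1 and X2 (an
    # illustrative choice; any scalar reproducing kernel would do).
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def fit(X, Y, B, lam=0.1, sigma=1.0):
    # Vector-valued regularized least-squares: with the separable kernel
    # K(x, x') = k(x, x') * B, the representer theorem gives coefficients
    # c solving the (n*T) x (n*T) linear system (G (x) B + n*lam*I) c = vec(Y),
    # where G is the n x n scalar Gram matrix and T the output dimension.
    n, T = Y.shape
    G = scalar_kernel(X, X, sigma)
    A = np.kron(G, B) + n * lam * np.eye(n * T)
    c = np.linalg.solve(A, Y.reshape(-1))   # row-major stacking of targets
    return c.reshape(n, T)                  # row i holds coefficient c_i

def predict(Xnew, X, C, B, sigma=1.0):
    # f(x) = sum_i k(x, x_i) * B @ c_i, evaluated for each row of Xnew.
    Gx = scalar_kernel(Xnew, X, sigma)      # m x n cross-kernel matrix
    return Gx @ C @ B.T                     # m x T predictions
```

With B = I the T outputs decouple into independent scalar kernel ridge regressions; a non-diagonal B shares information across outputs, which is the point of the operator-valued formulation.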


Files in this item

File: MIT-CSAIL-TR-2005-031.pdf — Size: 642.6Kb — Format: application/pdf
File: MIT-CSAIL-TR-2005-031.ps — Size: 12.09Mb — Format: application/postscript

