Risk Bounds for Regularized Least-Squares Algorithm with Operator-valued Kernels
Field | Value
---|---
dc.date.accessioned | 2005-12-22T02:28:54Z
dc.date.accessioned | 2018-11-24T10:24:28Z
dc.date.available | 2005-12-22T02:28:54Z
dc.date.available | 2018-11-24T10:24:28Z
dc.date.issued | 2005-05-16
dc.identifier.uri | http://hdl.handle.net/1721.1/30543
dc.identifier.uri | http://repository.aust.edu.ng/xmlui/handle/1721.1/30543
dc.description.abstract | We show that recent results in [3] on risk bounds for regularized least-squares on reproducing kernel Hilbert spaces can be straightforwardly extended to the vector-valued regression setting. We first briefly introduce central concepts on operator-valued kernels. Then we show how risk bounds can be expressed in terms of a generalization of effective dimension.
dc.format.extent | 17 p.
dc.format.extent | 12090406 bytes
dc.format.extent | 642646 bytes
dc.language.iso | en_US
dc.subject | AI
dc.subject | optimal rates
dc.subject | reproducing kernel Hilbert space
dc.subject | effective dimension
dc.title | Risk Bounds for Regularized Least-squares Algorithm with Operator-valued kernels
Files in this item
Files | Size | Format
---|---|---
MIT-CSAIL-TR-2005-031.pdf | 642.6Kb | application/pdf
MIT-CSAIL-TR-2005-031.ps | 12.09Mb | application/postscript