dc.date.accessioned | 2005-12-22T02:28:27Z | |
dc.date.accessioned | 2018-11-24T10:24:28Z | |
dc.date.available | 2005-12-22T02:28:27Z | |
dc.date.available | 2018-11-24T10:24:28Z | |
dc.date.issued | 2005-04-14 | |
dc.identifier.uri | http://hdl.handle.net/1721.1/30539 | |
dc.identifier.uri | http://repository.aust.edu.ng/xmlui/handle/1721.1/30539 | |
dc.description.abstract | We develop a theoretical analysis of the generalization performance of regularized least-squares on reproducing kernel Hilbert spaces for supervised learning. We show that the effective dimension of an integral operator plays a central role in defining a criterion for choosing the regularization parameter as a function of the number of samples. A minimax analysis shows that this criterion is asymptotically optimal. | |
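For illustration, a minimal Python/NumPy sketch of the regularized least-squares algorithm the abstract refers to, together with the empirical effective dimension. The Gaussian kernel, its bandwidth, and the lambda_n = n^{-1/2} schedule are illustrative assumptions, not the paper's prescription, whose optimal choice of lambda_n depends on the effective dimension.

    import numpy as np

    def gaussian_kernel(X1, X2, sigma=1.0):
        # Gaussian (RBF) kernel matrix; kernel choice and bandwidth are assumptions.
        d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    def effective_dimension(K, lam):
        # Empirical effective dimension N(lambda) = tr(K (K + n*lambda*I)^{-1}),
        # the sample analogue of tr(T (T + lambda)^{-1}) for the integral operator T.
        n = K.shape[0]
        return np.trace(K @ np.linalg.inv(K + n * lam * np.eye(n)))

    def rls_fit(K, y, lam):
        # Regularized least-squares: by the representer theorem, the minimizer of
        # (1/n) sum_i (f(x_i) - y_i)^2 + lam * ||f||_H^2 has coefficients
        # alpha = (K + n*lam*I)^{-1} y.
        n = len(y)
        return np.linalg.solve(K + n * lam * np.eye(n), y)

    # Toy usage: lambda shrinking with n; n**-0.5 is a placeholder schedule.
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(200, 1))
    y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(200)
    n = len(y)
    lam = n ** -0.5
    K = gaussian_kernel(X, X)
    alpha = rls_fit(K, y, lam)
    y_hat = K @ alpha  # in-sample predictions
    print("effective dimension N(lambda):", effective_dimension(K, lam))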
dc.format.extent | 25 p. | |
dc.format.extent | 16130108 bytes | |
dc.format.extent | 833989 bytes | |
dc.language.iso | en_US | |
dc.subject | AI | |
dc.subject | optimal rates | |
dc.subject | regularized least-squares | |
dc.subject | reproducing kernel Hilbert space | |
dc.subject | effective dimension | |
dc.title | Fast Rates for Regularized Least-squares Algorithm | |