dc.date.accessioned | 2011-02-01T20:00:05Z | |
dc.date.accessioned | 2018-11-26T22:26:31Z | |
dc.date.available | 2011-02-01T20:00:05Z | |
dc.date.available | 2018-11-26T22:26:31Z | |
dc.date.issued | 2011-01-24 | |
dc.identifier.uri | http://hdl.handle.net/1721.1/60875 | |
dc.identifier.uri | http://repository.aust.edu.ng/xmlui/handle/1721.1/60875 | |
dc.description.abstract | In this paper we study a class of regularized kernel methods for vector-valued learning which are based on filtering the spectrum of the kernel matrix. The considered methods include Tikhonov regularization as a special case, as well as interesting alternatives such as vector-valued extensions of L2 boosting. Computational properties are discussed for various examples of kernels for vector-valued functions, and the benefits of iterative techniques are illustrated. Generalizing previous results for the scalar case, we show finite sample bounds for the excess risk of the obtained estimator and, in turn, these results allow us to prove consistency for both regression and multi-category classification. Finally, we present some promising results of the proposed algorithms on artificial and real data. | en_US |
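As a rough illustration of the idea in the abstract, the sketch below applies a Tikhonov spectral filter g(s) = 1/(s + nλ) to the eigenvalues of a kernel matrix to fit a two-output regression problem. The Gaussian kernel, the toy data, and the regularization value are assumptions for illustration only; they are not the paper's actual kernels or experiments, and other filter functions (e.g. iterative L2-boosting-style filters) would slot into the same structure.

```python
import numpy as np

def gaussian_kernel(X1, X2, sigma=1.0):
    # Pairwise squared distances -> Gaussian (RBF) kernel matrix.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def spectral_filter_fit(K, Y, lam=1e-3):
    # Tikhonov filter g(s) = 1/(s + n*lam) applied to the spectrum of K;
    # algebraically this equals solving (K + n*lam*I) C = Y, written here
    # through the eigendecomposition to make the "filtering" explicit.
    n = K.shape[0]
    evals, evecs = np.linalg.eigh(K)
    filtered = 1.0 / (evals + n * lam)          # the spectral filter
    return evecs @ (filtered[:, None] * (evecs.T @ Y))

# Hypothetical multi-output toy problem: two smooth target functions.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(50, 1))
Y = np.hstack([np.sin(3 * X), np.cos(3 * X)]) + 0.05 * rng.standard_normal((50, 2))

K = gaussian_kernel(X, X)
C = spectral_filter_fit(K, Y)    # one coefficient column per output
Y_hat = K @ C                    # in-sample predictions, shape (50, 2)
```

Because the same filtered spectrum is reused for every output column, the vector-valued fit costs little more than a single scalar regression once the eigendecomposition is computed.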
dc.format.extent | 37 p. | en_US |
dc.subject | Computational Learning, Multi-Output Learning, Spectral Methods | en_US |
dc.title | Multi-Output Learning via Spectral Filtering | en_US |