
Transfering Nonlinear Representations using Gaussian Processes with a Shared Latent Space

dc.date.accessioned: 2007-11-13T14:45:17Z
dc.date.accessioned: 2018-11-24T10:25:47Z
dc.date.available: 2007-11-13T14:45:17Z
dc.date.available: 2018-11-24T10:25:47Z
dc.date.issued: 2007-11-06 en_US
dc.identifier.uri: http://hdl.handle.net/1721.1/39426
dc.identifier.uri: http://repository.aust.edu.ng/xmlui/handle/1721.1/39426
dc.description.abstract: When a series of problems is related, representations derived from learning earlier tasks may be useful in solving later problems. In this paper we propose a novel approach to transfer learning with low-dimensional, non-linear latent spaces. We show how such representations can be jointly learned across multiple tasks in a discriminative probabilistic regression framework. When transferred to new tasks with relatively few training examples, learning can be faster and/or more accurate. Experiments on a digit recognition task show significantly improved performance when compared to baseline performance with the original feature representation or with a representation derived from a semi-supervised learning approach. en_US
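The abstract's core idea, a low-dimensional latent space shared across related tasks that speeds up learning on a new task with few labels, can be illustrated with a toy sketch. This is not the report's discriminative GP model: it uses a linear SVD projection as a stand-in for the learned nonlinear latent map, and plain GP regression on the new task. The data-generating setup and all variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: two related regression tasks whose 20-D inputs share a 2-D
# latent structure (Z @ W.T plus noise).  Entirely synthetic.
d_obs, d_lat, n = 20, 2, 100
W = rng.normal(size=(d_obs, d_lat))            # shared embedding
Z1 = rng.normal(size=(n, d_lat))               # task-1 latent coordinates
Z2 = rng.normal(size=(n, d_lat))               # task-2 latent coordinates
X1 = Z1 @ W.T + 0.1 * rng.normal(size=(n, d_obs))
X2 = Z2 @ W.T + 0.1 * rng.normal(size=(n, d_obs))
y2 = np.sin(Z2[:, 0])                          # new-task targets

# "Transfer" step (linear stand-in for the paper's learned nonlinear
# latent space): recover the shared subspace from task-1 inputs alone.
_, _, Vt = np.linalg.svd(X1 - X1.mean(0), full_matrices=False)
B = Vt[:d_lat].T                               # recovered latent basis

def gp_predict(Xtr, ytr, Xte, ls=1.0, noise=1e-2):
    """Posterior mean of zero-mean GP regression with an RBF kernel."""
    def k(A, C):
        d2 = ((A[:, None, :] - C[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / ls ** 2)
    K = k(Xtr, Xtr) + noise * np.eye(len(Xtr))
    return k(Xte, Xtr) @ np.linalg.solve(K, ytr)

def median_ls(X):
    """Median-heuristic lengthscale for the RBF kernel."""
    d = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    return np.median(d[d > 0])

# Few-shot new task: 15 labelled examples, evaluated in the transferred
# latent space versus the raw 20-D feature space.
tr, te = slice(0, 15), slice(15, None)
Zhat = X2 @ B
mse_latent = np.mean(
    (gp_predict(Zhat[tr], y2[tr], Zhat[te], ls=median_ls(Zhat[tr])) - y2[te]) ** 2)
mse_raw = np.mean(
    (gp_predict(X2[tr], y2[tr], X2[te], ls=median_ls(X2[tr])) - y2[te]) ** 2)
print(f"MSE latent-space GP: {mse_latent:.3f}   raw-space GP: {mse_raw:.3f}")
```

The sketch only shows the mechanics of projecting both tasks into one latent space before regression; in the report the latent space itself is learned jointly and nonlinearly within the GP framework rather than fixed by SVD.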
dc.format.extent: 8 p. en_US
dc.relation: Massachusetts Institute of Technology Computer Science and Artificial Intelligence Laboratory en_US
dc.subject: transfer learning en_US
dc.subject: latent variable models en_US
dc.title: Transfering Nonlinear Representations using Gaussian Processes with a Shared Latent Space en_US


Files in this item

Files                       Size      Format
MIT-CSAIL-TR-2007-053.pdf   327.4Kb   application/pdf
MIT-CSAIL-TR-2007-053.ps    798.2Kb   application/postscript

