
Estimating Dependency Structure as a Hidden Variable

dc.date.accessioned: 2004-10-20T21:04:25Z
dc.date.accessioned: 2018-11-24T10:23:33Z
dc.date.available: 2004-10-20T21:04:25Z
dc.date.available: 2018-11-24T10:23:33Z
dc.date.issued: 1998-09-01
dc.identifier.uri: http://hdl.handle.net/1721.1/7257
dc.identifier.uri: http://repository.aust.edu.ng/xmlui/handle/1721.1/7257
dc.description.abstract: This paper introduces a probability model, the mixture of trees, that can account for sparse, dynamically changing dependence relationships. We present a family of efficient algorithms that use EM and the Minimum Spanning Tree algorithm to find the ML and MAP mixture of trees under a variety of priors, including the Dirichlet and MDL priors. We also show that the single-tree classifier acts as an implicit feature selector, making classification performance insensitive to irrelevant attributes. Experimental results demonstrate the excellent performance of the new model in both density estimation and classification.
dc.format.extent: 1320254 bytes
dc.format.extent: 477415 bytes
dc.language.iso: en_US
dc.title: Estimating Dependency Structure as a Hidden Variable
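The structure-search step the abstract describes, fitting a tree over the variables via a spanning-tree algorithm on pairwise statistics, is the Chow-Liu construction: weight each pair of variables by its empirical mutual information and take a maximum-weight spanning tree. A minimal sketch under that reading (function names and the toy data are illustrative, not from the paper; the paper's full algorithm additionally mixes several such trees with EM):

```python
import math
from itertools import combinations
from collections import Counter

def mutual_information(data, i, j):
    """Empirical mutual information between variables i and j.

    data is a list of tuples of discrete values, one tuple per sample.
    """
    n = len(data)
    pi = Counter(row[i] for row in data)
    pj = Counter(row[j] for row in data)
    pij = Counter((row[i], row[j]) for row in data)
    mi = 0.0
    for (a, b), count in pij.items():
        p_ab = count / n
        # p_ab * log( p_ab / (p_a * p_b) )
        mi += p_ab * math.log(p_ab * n * n / (pi[a] * pj[b]))
    return mi

def chow_liu_tree(data, n_vars):
    """Maximum-weight spanning tree over pairwise mutual information.

    Uses Kruskal's algorithm with a union-find over variable indices;
    returns the tree as a list of (i, j) edges.
    """
    edges = sorted(((mutual_information(data, i, j), i, j)
                    for i, j in combinations(range(n_vars), 2)),
                   reverse=True)
    parent = list(range(n_vars))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    tree = []
    for _, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:          # keep the edge only if it joins two components
            parent[ri] = rj
            tree.append((i, j))
    return tree

# Toy usage: variables 0 and 1 are perfectly correlated, variable 2 is noise,
# so the learned tree must contain the edge (0, 1).
data = [(0, 0, 0), (1, 1, 0), (0, 0, 1), (1, 1, 1), (1, 1, 0), (0, 0, 1)]
tree = chow_liu_tree(data, 3)
```

In the mixture setting of the abstract, this step would be run once per mixture component inside each EM iteration, with samples weighted by their component responsibilities.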


Files in this item

File          Size      Format
AIM-1648.pdf  477.4 Kb  application/pdf
AIM-1648.ps   1.320 Mb  application/postscript

This item appears in the following Collection(s)
