
A Note on Support Vector Machines Degeneracy

dc.date.accessioned: 2004-10-22T20:17:55Z
dc.date.accessioned: 2018-11-24T10:23:43Z
dc.date.available: 2004-10-22T20:17:55Z
dc.date.available: 2018-11-24T10:23:43Z
dc.date.issued: 1999-08-11
dc.identifier.uri: http://hdl.handle.net/1721.1/7291
dc.identifier.uri: http://repository.aust.edu.ng/xmlui/handle/1721.1/7291
dc.description.abstract: When training Support Vector Machines (SVMs) over non-separable data sets, one sets the threshold $b$ using any dual cost coefficient that is strictly between the bounds of $0$ and $C$. We show that there exist SVM training problems with dual optimal solutions with all coefficients at bounds, but that all such problems are degenerate in the sense that the "optimal separating hyperplane" is given by $\mathbf{w} = \mathbf{0}$, and the resulting (degenerate) SVM will classify all future points identically (to the class that supplies more training data). We also derive necessary and sufficient conditions on the input data for this to occur. Finally, we show that an SVM training problem can always be made degenerate by the addition of a single data point belonging to a certain unbounded polyhedron, which we characterize in terms of its extreme points and rays.
dc.format.extent: 10 p.
dc.format.extent: 1117769 bytes
dc.format.extent: 262084 bytes
dc.language.iso: en_US
dc.subject: AI
dc.subject: MIT
dc.subject: Artificial Intelligence
dc.subject: Support Vector Machines
dc.subject: Scale Sensitive Loss Function
dc.subject: Statistical Learning Theory
dc.title: A Note on Support Vector Machines Degeneracy
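The degeneracy described in the abstract can be checked numerically. Below is a minimal numpy sketch (not from the paper) using a hypothetical five-point data set in which every input appears with both labels and class $+1$ supplies more points: a dual solution with every coefficient at a bound ($0$ or $C$) satisfies the SVM KKT conditions, the weight vector collapses to zero, and the classifier assigns every point to the majority class.

```python
import numpy as np

# Hypothetical toy data: each input occurs with both labels,
# and class +1 contributes one extra point (3 vs. 2).
X = np.array([[1.0], [1.0], [-1.0], [-1.0], [1.0]])
y = np.array([1.0, -1.0, 1.0, -1.0, 1.0])
C = 1.0

# Candidate dual solution with every coefficient at a bound (0 or C).
alpha = np.array([C, C, C, C, 0.0])

# Dual feasibility: sum_i alpha_i y_i = 0 and 0 <= alpha_i <= C.
assert abs(np.dot(alpha, y)) < 1e-12
assert np.all((alpha >= 0) & (alpha <= C))

# Primal weight vector w = sum_i alpha_i y_i x_i collapses to zero.
w = (alpha * y) @ X
b = 1.0  # a threshold consistent with the KKT conditions below

# KKT optimality: margins y_i (w . x_i + b) must be >= 1 where
# alpha_i = 0 and <= 1 where alpha_i = C.
margins = y * (X @ w + b)
assert np.all(margins[alpha == 0] >= 1 - 1e-12)
assert np.all(margins[alpha == C] <= 1 + 1e-12)

# The degenerate classifier sign(w . x + b) ignores x entirely and
# sends every point to the majority class (+1 here).
preds = np.sign(X @ w + b)
print(w, preds)  # w == [0.], every prediction is +1
```

Because $\mathbf{w} = \mathbf{0}$, the decision function reduces to $\operatorname{sign}(b)$, which is exactly the "classify all future points identically" behaviour the abstract describes.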


Files in this item

File          Size      Format
AIM-1661.pdf  262.0Kb   application/pdf
AIM-1661.ps   1.117Mb   application/postscript

