
Properties of Support Vector Machines

dc.date.accessioned: 2004-10-20T21:04:01Z
dc.date.accessioned: 2018-11-24T10:23:30Z
dc.date.available: 2004-10-20T21:04:01Z
dc.date.available: 2018-11-24T10:23:30Z
dc.date.issued: 1997-08-01
dc.identifier.uri: http://hdl.handle.net/1721.1/7246
dc.identifier.uri: http://repository.aust.edu.ng/xmlui/handle/1721.1/7246
dc.description.abstract: Support Vector Machines (SVMs) perform pattern recognition between two point classes by finding a decision surface determined by certain points of the training set, termed Support Vectors (SV). This surface, which in some feature space of possibly infinite dimension can be regarded as a hyperplane, is obtained from the solution of a quadratic programming problem that depends on a regularization parameter. In this paper we study some mathematical properties of support vectors and show that the decision surface can be written as the sum of two orthogonal terms, the first depending only on the margin vectors (which are SVs lying on the margin), the second proportional to the regularization parameter. For almost all values of the parameter, this enables us to predict how the decision surface varies for small parameter changes. In the special but important case of a feature space of finite dimension m, we also show that there are at most m+1 margin vectors and observe that m+1 SVs are usually sufficient to fully determine the decision surface. For relatively small m, this latter result leads to a considerable reduction in the number of SVs.
dc.format.extent: 243488 bytes
dc.format.extent: 406239 bytes
dc.language.iso: en_US
dc.title: Properties of Support Vector Machines
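
The abstract distinguishes, among the support vectors, the margin vectors (those lying exactly on the margin) from the remaining SVs, and states that in a feature space of finite dimension m there are at most m+1 margin vectors. The following is a minimal sketch of that split, assuming scikit-learn's SVC on synthetic 2-D data; the dataset, parameter values, and numerical tolerance are illustrative choices, not taken from the paper. Margin vectors are identified here as the support vectors whose dual coefficient alpha_i lies strictly below the regularization parameter C.

import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
m = 2  # feature-space dimension (linear kernel, so feature space = input space)
X = np.vstack([rng.normal(-1.0, 1.0, size=(50, m)),
               rng.normal(+1.0, 1.0, size=(50, m))])
y = np.array([-1] * 50 + [+1] * 50)

C = 1.0  # regularization parameter of the quadratic programming problem
clf = SVC(kernel="linear", C=C).fit(X, y)

# dual_coef_ holds y_i * alpha_i for each support vector; its magnitude is alpha_i.
alpha = np.abs(clf.dual_coef_).ravel()

# Margin vectors: support vectors with alpha_i strictly below C (they sit exactly on the margin).
margin = alpha < C - 1e-6

print(f"support vectors: {alpha.size}")
print(f"margin vectors:  {int(margin.sum())}  (abstract's bound for generic data: m + 1 = {m + 1})")

Running the sketch on such data typically yields many support vectors at bound but only two or three margin vectors, consistent with the m+1 bound quoted in the abstract for m = 2.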


Files in this item

File          Size      Format
AIM-1612.pdf  406.2 KB  application/pdf
AIM-1612.ps   243.4 KB  application/postscript

