Show simple item record

A Theory of Networks for Approximation and Learning

dc.date.accessioned: 2004-10-04T15:14:19Z
dc.date.accessioned: 2018-11-24T10:14:27Z
dc.date.available: 2004-10-04T15:14:19Z
dc.date.available: 2018-11-24T10:14:27Z
dc.date.issued: 1989-07-01 (en_US)
dc.identifier.uri: http://hdl.handle.net/1721.1/6511
dc.identifier.uri: http://repository.aust.edu.ng/xmlui/handle/1721.1/6511
dc.description.abstract: Learning an input-output mapping from a set of examples, of the type that many neural networks have been constructed to perform, can be regarded as synthesizing an approximation of a multi-dimensional function, that is, solving the problem of hypersurface reconstruction. From this point of view, this form of learning is closely related to classical approximation techniques, such as generalized splines and regularization theory. This paper considers the problems of exact representation and, in more detail, of the approximation of linear and nonlinear mappings in terms of simpler functions of fewer variables. Kolmogorov's theorem concerning the representation of functions of several variables in terms of functions of one variable turns out to be almost irrelevant in the context of networks for learning. We develop a theoretical framework for approximation based on regularization techniques that leads to a class of three-layer networks that we call Generalized Radial Basis Functions (GRBF), since they are mathematically related to the well-known Radial Basis Functions, mainly used for strict interpolation tasks. GRBF networks are not only equivalent to generalized splines, but are also closely related to pattern recognition methods such as Parzen windows and potential functions and to several neural network algorithms, such as Kanerva's associative memory, backpropagation and Kohonen's topology preserving map. They also have an interesting interpretation in terms of prototypes that are synthesized and optimally combined during the learning stage. The paper introduces several extensions and applications of the technique and discusses intriguing analogies with neurobiological data. (en_US)
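The abstract contrasts GRBF networks with the strict interpolation use of Radial Basis Functions. As background, the classical strict-interpolation setting the paper starts from can be sketched as follows; this is an illustrative example, not code from the paper, and the function names, the Gaussian basis choice, and the width parameter are all assumptions.

```python
import numpy as np

def rbf_interpolate(x_train, y_train, x_query, sigma=0.1):
    """Strict RBF interpolation: fit f(x) = sum_i c_i * exp(-(x - x_i)^2 / (2 sigma^2))
    exactly through the examples, then evaluate the superposition at x_query.
    (GRBF, as the paper describes, relaxes this with regularization and
    fewer centers than examples.)"""
    def kernel(a, b):
        # Gaussian radial basis function on all pairwise squared distances
        d2 = (a[:, None] - b[None, :]) ** 2
        return np.exp(-d2 / (2.0 * sigma ** 2))

    G = kernel(x_train, x_train)         # Gram matrix over the data points
    c = np.linalg.solve(G, y_train)      # coefficients for an exact fit
    return kernel(x_query, x_train) @ c  # weighted superposition of basis functions

# Interpolating a sampled sine: at the training points the fit is exact.
x = np.linspace(0.0, 1.0, 8)
y = np.sin(2 * np.pi * x)
y_hat = rbf_interpolate(x, y, x)
print(np.allclose(y_hat, y, atol=1e-6))
```

Because strict interpolation must pass through every example, it is sensitive to noise; the regularized GRBF formulation in the paper trades exact fit for smoothness.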
dc.format.extent: 409368 bytes
dc.format.extent: 1744677 bytes
dc.language.iso: en_US
dc.title: A Theory of Networks for Approximation and Learning (en_US)


Files in this item

Files          Size      Format                    View
AIM-1140.pdf   1.744Mb   application/pdf           View/Open
AIM-1140.ps.Z  409.3Kb   application/octet-stream  View/Open

This item appears in the following Collection(s)
