

NTRS - NASA Technical Reports Server

A simple method to derive bounds on the size and to train multilayer neural networks

A new derivation is presented for the bounds on the size of a multilayer neural network needed to exactly implement an arbitrary training set; namely, the training set can be implemented with zero error with two layers and with the number of hidden-layer neurons n₁ ≥ p − 1, where p is the number of training pairs. The derivation does not require the separation of the input space by particular hyperplanes, as in previous derivations. The weights for the hidden layer can be chosen almost arbitrarily, and the weights for the output layer can be found by solving n₁ + 1 linear equations. The method presented exactly solves (M), the multilayer neural network training problem, for any arbitrary training set.
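The construction described in the abstract can be sketched as follows; this is a minimal illustration, assuming a sigmoid hidden-layer activation and a single scalar output, with variable names and toy data that are hypothetical rather than taken from the paper. With p training pairs, the hidden-layer weights are drawn at random and the output-layer weights and bias (n₁ + 1 = p unknowns) are obtained by solving the resulting square linear system, so the training set is fit exactly up to floating-point error.

import numpy as np

# Hypothetical toy training set: p input/target pairs (sizes are illustrative).
rng = np.random.default_rng(0)
p, n_in = 8, 3
X = rng.standard_normal((p, n_in))   # p inputs in R^n_in
d = rng.standard_normal(p)           # p scalar targets

n1 = p - 1                           # hidden-layer size from the bound n1 >= p - 1

# Hidden-layer weights chosen (almost) arbitrarily, as the abstract states.
W = rng.standard_normal((n_in, n1))
b = rng.standard_normal(n1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

H = sigmoid(X @ W + b)               # p x n1 matrix of hidden-layer outputs

# Output layer: n1 weights plus one bias, i.e. n1 + 1 = p unknowns.
# Solving the p x p linear system reproduces the targets exactly.
H_aug = np.hstack([H, np.ones((p, 1))])
v = np.linalg.solve(H_aug, d)

y = H_aug @ v                        # network outputs on the training set
print(np.max(np.abs(y - d)))         # ~0, up to floating-point error

In this sketch the square system is generically nonsingular for random hidden weights; in practice a least-squares solve (np.linalg.lstsq) could be substituted for robustness.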
Document ID
19910057851
Acquisition Source
Legacy CDMS
Document Type
Reprint (Version printed in journal)
External Source(s)
Authors
Sartori, Michael A.
(University of Notre Dame, IN, United States)
Antsaklis, Panos J.
(University of Notre Dame, IN, United States)
Date Acquired
August 15, 2013
Publication Date
July 1, 1991
Publication Information
Publication: IEEE Transactions on Neural Networks
Volume: 2
ISSN: 1045-9227
Subject Category
Cybernetics
Accession Number
91A42474
Funding Number(s)
Contract/Grant: JPL-957856
Distribution Limits
Public
Copyright
Other

Available Downloads

There are no available downloads for this record.