NTRS - NASA Technical Reports Server

Depth-size tradeoffs for neural computation

The tradeoffs between the depth (i.e., the time for parallel computation) and the size (i.e., the number of threshold gates) of neural networks are studied. The authors focus on neural computation of symmetric Boolean functions and some arithmetic functions. It is shown that a significant reduction in size is possible for symmetric functions and some arithmetic functions, at the expense of a small constant increase in depth. In the process, several neural networks are developed that have the minimum size among all known constructions. The results on implementing symmetric functions can be used to improve results about arbitrary Boolean functions. In particular, it is shown that any Boolean function can be computed by a depth-3 neural network with O(2^(n/2)) threshold gates; it is also proven that this is essentially the minimum number of threshold gates required.
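To make the setting concrete, here is an illustrative sketch (not taken from the paper) of a classical depth-2 threshold circuit for PARITY, a symmetric Boolean function of the kind whose depth-size tradeoffs the paper studies. A threshold gate outputs 1 exactly when its weighted input sum reaches its threshold; PARITY can be computed in depth 2 with n + 1 such gates.

```python
from itertools import product

def threshold_gate(weights, inputs, t):
    """Linear threshold gate: output 1 iff the weighted input sum reaches t."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= t else 0

def parity_depth2(x):
    """Depth-2 threshold circuit computing PARITY of the bit vector x."""
    n = len(x)
    # First layer: n gates; gate k fires iff at least k of the inputs are 1.
    layer1 = [threshold_gate([1] * n, x, k) for k in range(1, n + 1)]
    # Output gate: alternating +1/-1 weights. If s inputs are 1, the signed
    # sum telescopes to 1 when s is odd and 0 when s is even.
    alt_weights = [(-1) ** k for k in range(n)]  # +1, -1, +1, ...
    return threshold_gate(alt_weights, layer1, 1)

# Exhaustive check on all 6-bit inputs.
assert all(parity_depth2(list(x)) == sum(x) % 2
           for x in product([0, 1], repeat=6))
```

Reductions of the kind the paper studies trade a small constant increase in depth for circuits with far fewer gates than such direct constructions.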
Document ID
19920039648
Acquisition Source
Legacy CDMS
Document Type
Reprint (Version printed in journal)
External Source(s)
Authors
Siu, Kai-Yeung
(University of California, Irvine, CA, United States)
Roychowdhury, Vwani P.
(Purdue University, West Lafayette, IN, United States)
Kailath, Thomas
(Stanford University, Stanford, CA, United States)
Date Acquired
August 15, 2013
Publication Date
December 1, 1991
Publication Information
Publication: IEEE Transactions on Computers
Volume: 40
ISSN: 0018-9340
Subject Category
Cybernetics
Accession Number
92A22272
Funding Number(s)
CONTRACT_GRANT: DAAL03-88-C-0011
CONTRACT_GRANT: NAGW-419
CONTRACT_GRANT: DAAL03-90-G-0108
Distribution Limits
Public
Copyright
Other

Available Downloads

There are no available downloads for this record.