Learning and optimization with cascaded VLSI neural network building-block chips

To demonstrate the versatility of the building-block approach, two neural network applications were implemented on cascaded analog VLSI chips. Weights were implemented using 7-bit multiplying digital-to-analog converter (MDAC) synapse circuits, with 31 x 32 and 32 x 32 synapses per chip. A novel learning algorithm compatible with analog VLSI was applied to the two-input parity problem; it combines a dynamically evolving architecture with limited gradient-descent backpropagation for efficient and versatile supervised learning. To implement the learning algorithm in hardware, synapse circuits were paralleled to obtain additional quantization levels. The hardware-in-the-loop learning system allocated 2-5 hidden neurons for parity problems. In addition, a 7 x 7 assignment problem was mapped onto a cascaded 64-neuron fully connected feedback network. In 100 randomly selected problems, the network found optimal or good solutions in most cases, with settling times in the range of 7-100 microseconds.
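The abstract mentions two hardware constraints worth illustrating: weights limited to a 7-bit signed MDAC code, and synapses paralleled to gain additional quantization levels. The sketch below models both in software; the coarse-plus-fine scaling of the paralleled pair is a hypothetical scheme chosen for illustration, not the circuit configuration described in the paper.

```python
def quantize(w, bits=7, w_max=1.0):
    """Quantize a weight to a signed MDAC code of the given bit width.

    A 7-bit signed MDAC provides 2**(bits-1) - 1 = 63 levels per polarity
    (plus zero); weights outside +/- w_max saturate at full scale.
    """
    levels = 2 ** (bits - 1) - 1
    code = max(-levels, min(levels, round(w / w_max * levels)))
    return code * w_max / levels


def parallel_quantize(w, bits=7, w_max=1.0):
    """Model two paralleled synapses summing their currents: a coarse
    MDAC plus a fine MDAC whose full-scale range equals one coarse step
    (an assumed scaling, giving finer effective resolution)."""
    coarse = quantize(w, bits, w_max)
    fine = quantize(w - coarse, bits, w_max / (2 ** (bits - 1) - 1))
    return coarse + fine


if __name__ == "__main__":
    w = 0.505
    print(abs(quantize(w) - w))           # coarse-only quantization error
    print(abs(parallel_quantize(w) - w))  # smaller error from the paralleled pair
```

With such a scheme, the residual after the coarse synapse is at most half a coarse step, and the fine synapse reduces it by roughly another factor of 63, which is one way paralleling synapses can supply the extra resolution that gradient-descent learning needs.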
Document ID: 19930053005
Acquisition Source: Legacy CDMS
Document Type: Conference Paper
Authors:
Duong, T. (Jet Propulsion Lab., California Inst. of Tech., Pasadena, CA, United States)
Eberhardt, S. P. (Jet Propulsion Lab., California Inst. of Tech., Pasadena, CA, United States)
Tran, M. (Jet Propulsion Lab., California Inst. of Tech., Pasadena, CA, United States)
Daud, T. (Jet Propulsion Lab., California Inst. of Tech., Pasadena, CA, United States)
Thakoor, A. P. (JPL, Pasadena, CA, United States)
Date Acquired: August 16, 2013
Publication Date: January 1, 1992
Publication Information:
Publication: In: IJCNN - International Joint Conference on Neural Networks, Baltimore, MD, June 7-11, 1992, Proceedings. Vol. 1 (A93-37001 14-63)
Publisher: Institute of Electrical and Electronics Engineers, Inc.