In this paper we consider binary neurons with a threshold nonlinear transfer function and detail a novel direct design algorithm, an alternative to classical learning algorithms, which determines the number of layers, the number of neurons in each layer, and the synaptic weights of a particular neural network. While the feedforward neural network is described by m examples of n bits each, the optimisation criteria are changed: besides the classical size and depth, we also use the A and AT² complexity measures of VLSI circuits (A being the area of the chip, and T the delay for propagating the inputs to the outputs). We consider the maximum fan-in of one neuron as a parameter and show its influence on the area, obtaining a full class of solutions. Results are compared with another constructive algorithm. Further directions for research are pointed out in the conclusions, together with some open questions.
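The binary threshold neuron the abstract refers to can be sketched as follows; this is an illustrative example only, with names (`threshold_neuron`, `weights`, `theta`) chosen here rather than taken from the paper, and it does not reproduce the paper's design algorithm.

```python
def threshold_neuron(inputs, weights, theta):
    """Binary threshold unit: outputs 1 iff the weighted sum of its
    binary inputs reaches the threshold theta, else 0.
    The fan-in of the neuron is len(inputs)."""
    assert len(inputs) == len(weights)
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s >= theta else 0

# A fan-in-3 neuron computing 2-out-of-3 majority (all weights 1, theta = 2):
print(threshold_neuron([1, 0, 1], [1, 1, 1], 2))  # prints 1 (two inputs set)
print(threshold_neuron([0, 0, 1], [1, 1, 1], 2))  # prints 0 (only one input set)
```

Bounding the fan-in (the number of inputs per neuron) is what trades depth against area in the class of solutions the paper describes.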
|Title of host publication||International Conference on Artificial Neural Nets and Genetic Algorithms|
|Publication status||Published - Apr 19 1995|
|Event||ICANNGA'95 - Alès, France|
Duration: Apr 19 1995 → …
|Period||4/19/95 → …|