This paper examines the circuit complexity of feedforward neural networks with sigmoid activation functions. The starting point is the complexity class NN_k defined in prior work. First, two additional complexity classes, NN_k^D and NN_k^{D,ε}, having less restrictive conditions than NN_k on fan-in and accuracy, are defined. We then prove several relations among these three classes and well-established circuit complexity classes. All the proofs are constructive, being built around binary adders. By relaxing the fan-in condition from logarithmic to (almost) polynomial, the results show that substantially larger classes of sigmoid-activation feedforward neural networks can be implemented in polynomial-size Boolean circuits. This comes at the expense of a logarithmic-factor increase in the number of layers, but with constant fan-in (of 2). The main conclusion is that there are interesting fan-in-dependent depth-size tradeoffs when trying to digitally impleme...
Title of host publication: World Conference on Neural Networks
Publisher: Lawrence Erlbaum Associates, Inc., and INNS Press
Publication status: Published - Jun 4 1994
Event: WCNN'94 - San Diego, CA, USA
Duration: Jun 4 1994 → …
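The abstract states that the constructions are built around binary adders realized with constant fan-in-2 Boolean gates. As an illustrative sketch only (not taken from the paper, whose adders target logarithmic depth), the following Python simulates a ripple-carry adder in which every gate — AND, OR, XOR — has exactly two inputs, showing how addition decomposes into fan-in-2 building blocks:

```python
# Illustrative sketch, not the paper's construction: a binary adder
# assembled purely from 2-input Boolean gates (constant fan-in of 2).
# A ripple-carry chain has linear depth; the paper's adder-based
# constructions trade depth for size more carefully.

def full_adder(a, b, cin):
    """One full-adder cell; each operator below is a fan-in-2 gate."""
    p = a ^ b              # 2-input XOR
    s = p ^ cin            # 2-input XOR: sum bit
    cout = (a & b) | (p & cin)  # two 2-input ANDs, one 2-input OR
    return s, cout

def ripple_add(x_bits, y_bits):
    """Add two equal-length little-endian bit lists via a carry chain."""
    assert len(x_bits) == len(y_bits)
    out, carry = [], 0
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    out.append(carry)      # final carry-out becomes the top bit
    return out

def to_bits(n, width):
    return [(n >> i) & 1 for i in range(width)]

def from_bits(bits):
    return sum(b << i for i, b in enumerate(bits))

print(from_bits(ripple_add(to_bits(13, 8), to_bits(29, 8))))  # prints 42
```

An n-bit ripple-carry adder uses 5n gates of fan-in 2 but depth O(n); replacing it with a carry-lookahead scheme trades a constant-factor size increase for O(log n) depth, which is the flavor of depth-size tradeoff the abstract refers to.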