Abstract
In the eighties, the lack of an algorithm to train multilayer Rosenblatt perceptrons was resolved by sigmoidal neural networks and backpropagation. But should we still try to find an efficient algorithm to train multilayer hardlimit neural networks, a problem known to be NP-complete? In this paper we show that this would not be a waste of time, by means of a counterexample in which a two-layer Rosenblatt perceptron with 21 neurons showed much more computational power than a two-layer sigmoidal feedforward neural network with 300 neurons trained by backpropagation on the same classification problem. We show why the synthesis of logical functions with threshold gates, or hardlimit perceptrons, is an active research area in VLSI design and nanotechnology, we review some of the methods for synthesizing logical functions with a multilayer hardlimit perceptron, and we propose, as a near-future objective, the search for a new method to synthesize any classification problem with analog inputs using a two-layer hardlimit perceptron. Nevertheless, we recognize that hardlimit multilayer perceptrons cannot approximate continuous functions in the way multilayer sigmoidal neural networks can.
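To illustrate the distinction the abstract draws, the following minimal sketch (not from the paper; the weights and the XOR task are chosen purely for illustration) contrasts a hardlimit (Heaviside step) unit with a sigmoidal unit and shows a hand-crafted two-layer hardlimit perceptron computing XOR, a function no single-layer perceptron can represent.

```python
import numpy as np

def hardlimit(z):
    # Threshold gate / Rosenblatt-style unit: outputs 0 or 1
    return (z >= 0).astype(float)

def sigmoid(z):
    # Differentiable unit used by backpropagation-trained networks
    return 1.0 / (1.0 + np.exp(-z))

def two_layer_forward(x, W1, b1, W2, b2, activation=hardlimit):
    hidden = activation(W1 @ x + b1)      # first (hidden) layer
    return activation(W2 @ hidden + b2)   # output layer

# Illustrative hand-picked weights: hidden units fire when x1+x2 >= 0.5
# and x1+x2 >= 1.5; the output fires for "first AND NOT second", i.e. XOR.
W1 = np.array([[1.0, 1.0], [1.0, 1.0]])
b1 = np.array([-0.5, -1.5])
W2 = np.array([[1.0, -1.0]])
b2 = np.array([-0.5])
for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    print(x, two_layer_forward(np.array(x, float), W1, b1, W2, b2))
```

The step activation is what makes gradient-based training like backpropagation inapplicable, which is why the paper discusses synthesis methods for such networks instead.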
Original language | Unknown |
---|---|
Title of host publication | Proceedings of the European Conference on Artificial Neural Networks |
Pages | 345-350 |
Publication status | Published - 1 Jan 2013 |
Event | ESANN 2013 - Duration: 1 Jan 2013 → … |
Conference
Conference | ESANN 2013 |
---|---|
Period | 1/01/13 → … |