Are Rosenblatt Multilayer Perceptrons More Powerful than Sigmoidal Multilayer Perceptrons? From a Counter Example to a General Result

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review



In the eighties, the lack of an algorithm to train multilayer Rosenblatt perceptrons was circumvented by sigmoidal neural networks and backpropagation. But should we still try to find an efficient algorithm to train multilayer hardlimit neural networks, a problem known to be NP-complete? In this paper we show that this would not be a waste of time, by means of a counterexample in which a two-layer Rosenblatt perceptron with 21 neurons exhibited much more computational power than a sigmoidal feedforward two-layer neural network with 300 neurons trained by backpropagation on the same classification problem. We show why the synthesis of logical functions with threshold gates, or hardlimit perceptrons, is an active research area in VLSI design and nanotechnology; we review some of the methods to synthesize logical functions with a multilayer hardlimit perceptron; and we propose, as a near-future objective, the search for a new method to synthesize any classification problem with analog inputs using a two-layer hardlimit perceptron. Nevertheless, we recognize that, unlike multilayer sigmoidal neural networks, hardlimit multilayer perceptrons cannot approximate continuous functions.
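As a minimal illustration of the hardlimit (threshold-gate) units the abstract refers to — this is a standard textbook example, not the paper's 21-neuron counterexample — a two-layer Rosenblatt perceptron with hand-chosen weights computes XOR exactly, a logical function that no single threshold unit can realize:

```python
def step(z):
    # Hard-limit (Heaviside) activation: fires iff the weighted sum is non-negative.
    return 1 if z >= 0 else 0

def xor_hardlimit(x1, x2):
    # Hidden layer: two threshold gates computing OR and AND of the binary inputs.
    h_or = step(x1 + x2 - 0.5)    # 1 iff at least one input is 1
    h_and = step(x1 + x2 - 1.5)   # 1 iff both inputs are 1
    # Output threshold gate: OR minus AND, i.e. "at least one but not both" = XOR.
    return step(h_or - h_and - 0.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_hardlimit(a, b))
```

Because the weights are integers and the activations are exact 0/1 values, the network realizes the logical function with no approximation error, which is the kind of exact synthesis the paper discusses for threshold gates in VLSI design.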
Original language: Unknown
Title of host publication: Proceedings of the European Conference on Artificial Neural Networks
Publication status: Published - 1 Jan 2013
Event: ESANN 2013
Duration: 1 Jan 2013 → …


Conference: ESANN 2013
Period: 1/01/13 → …
