Abstract
In previous work we showed that hardlimit multilayer neural networks have more computational power than sigmoidal multilayer neural networks [1]. In 1969 Minsky and Papert showed the limitations of the single perceptron, which can only solve linearly separable classification problems, and since at that time there was no algorithm to find the weights of a multilayer hardlimit perceptron, research on neural networks stagnated until the early eighties, when the Backpropagation algorithm was invented [2]. Nevertheless, since the sixties several algorithms had been proposed for implementing logical functions with threshold elements or hardlimit neurons; these could have been adapted to classification problems with multilayer hardlimit perceptrons, and the stagnation of research on neural networks could thus have been avoided. Although the problem of training a hardlimit neural network is NP-complete, our algorithm, based on mathematical programming, namely a mixed integer linear programming (MILP) model, takes only a few seconds to train the two-input XOR function and a simple logical function of three variables with two minterms. Since any linearly separable logical function can be implemented by a perceptron with integer weights, by varying the weights between -1 and 1 we found all 10 possible solutions for the implementation of the two-input XOR function, and all 14 and 18 possible solutions, respectively, for the implementation of two logical functions of three variables, using a two-layer architecture with two neurons in the first layer. We describe our MILP model and show why it consumes a lot of computational resources: even a small hardlimit neural network translates into a MILP model larger than 1 GB, requiring a more powerful computer than a common 32-bit PC.
We consider reducing the computational resources required to be the main objective of our near-future work on improving our novel MILP model, and we will also try a nonlinear version of our algorithm, based on a MINLP model, that will consume less memory.
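The two-layer hardlimit implementation of XOR described in the abstract can be illustrated directly. The following Python sketch is only an illustrative check, not the paper's MILP procedure: the particular weight assignment, the bias terms, and the step-threshold convention (the neuron fires when its weighted sum is strictly positive) are assumptions of this example, and the chosen solution is not necessarily one of the 10 solutions enumerated in the paper.

```python
from itertools import product

def step(z):
    # Hardlimit activation: fires when the weighted sum is strictly
    # positive (the exact threshold convention is an assumption here;
    # the paper may instead use >= 0).
    return 1 if z > 0 else 0

def two_layer_net(x1, x2, w):
    # w packs the 9 integer parameters of a 2-2-1 architecture:
    # (w11, w12, b1, w21, w22, b2, v1, v2, c), all in {-1, 0, 1}.
    w11, w12, b1, w21, w22, b2, v1, v2, c = w
    h1 = step(w11 * x1 + w12 * x2 + b1)   # first hidden hardlimit neuron
    h2 = step(w21 * x1 + w22 * x2 + b2)   # second hidden hardlimit neuron
    return step(v1 * h1 + v2 * h2 + c)    # output hardlimit neuron

# One hand-picked solution: h1 computes OR, h2 computes AND, and the
# output neuron computes "OR and not AND", which is exactly XOR.
xor_weights = (1, 1, 0,    # h1: x1 OR x2
               1, 1, -1,   # h2: x1 AND x2
               1, -1, 0)   # out: h1 - h2

for x1, x2 in product((0, 1), repeat=2):
    print(x1, x2, "->", two_layer_net(x1, x2, xor_weights))
```

In the paper's MILP model these integers are not fixed in advance but become decision variables of the optimization problem; the sketch above only verifies one feasible assignment, whereas the MILP formulation is what allows all solutions to be found systematically.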
Original language | English |
---|---|
Title of host publication | ADVANCES IN COMPUTATIONAL INTELLIGENCE, PT II |
Editors | I. Rojas, G. Joya, A. Catala |
Place of Publication | Germany |
Publisher | Springer Verlag |
Pages | 477-487 |
ISBN (Electronic) | 978-3-319-19221-5 |
ISBN (Print) | 978-3-319-19222-2 |
DOIs | |
Publication status | Published - Jun 2015 |
Event | 13th International Work-Conference on Artificial Neural Networks, IWANN 2015 - Palma de Mallorca, Spain Duration: 10 Jun 2015 → 12 Jun 2015 Conference number: 13th |
Publication series
Name | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) |
---|---|
Publisher | Springer-Verlag |
Volume | 9095 |
ISSN (Print) | 0302-9743 |
Conference
Conference | 13th International Work-Conference on Artificial Neural Networks, IWANN 2015 |
---|---|
Abbreviated title | IWANN 2015 |
Country/Territory | Spain |
City | Palma de Mallorca |
Period | 10/06/15 → 12/06/15 |
Keywords
- Hardlimit neural networks
- Mixed integer linear programming
- Training a hardlimit neural network with a MILP model
- Solving a MILP model with the CPLEX solver