TY - GEN
T1 - A Greedy Iterative Layered Framework for Training Feed Forward Neural Networks
AU - Custode, L. L.
AU - Tecce, C. L.
AU - Bakurov, I.
AU - Castelli, M.
AU - Della Cioppa, A.
AU - Vanneschi, L.
N1 - info:eu-repo/grantAgreement/FCT/9471 - RIDTI/PTDC%2FCCI-CIF%2F29877%2F2017/PT#
info:eu-repo/grantAgreement/FCT/3599-PPCDT/DSAIPA%2FDS%2F0113%2F2019/PT#
info:eu-repo/grantAgreement/FCT/3599-PPCDT/DSAIPA%2FDS%2F0022%2F2018/PT#
info:eu-repo/grantAgreement/FCT/3599-PPCDT/PTDC%2FCCI-INF%2F29168%2F2017/PT#
Custode, L. L., Tecce, C. L., Bakurov, I., Castelli, M., Cioppa, A. D., & Vanneschi, L. (2020). A Greedy Iterative Layered Framework for Training Feed Forward Neural Networks. In P. A. Castillo, J. L. Jiménez Laredo, & F. Fernández de Vega (Eds.), Applications of Evolutionary Computation - 23rd European Conference, EvoApplications 2020, Held as Part of EvoStar 2020, Proceedings (pp. 513-529). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 12104 LNCS). Springer. https://doi.org/10.1007/978-3-030-43722-0_33
PY - 2020/4/9
Y1 - 2020/4/9
AB - In recent years, neuroevolution has become a dynamic and rapidly growing research field. Interest in this discipline is motivated by the need to create ad-hoc networks whose topology and parameters are optimized for the particular problem at hand. Although neuroevolution-based techniques can contribute fundamentally to improving the performance of artificial neural networks (ANNs), they have a drawback: the massive amount of computational resources they require. This paper proposes a novel population-based framework aimed at finding the optimal set of synaptic weights for ANNs. The proposed method partitions the weights of a given network and, using an optimization heuristic, trains one layer at each step while “freezing” the remaining weights. In the experimental study, particle swarm optimization (PSO) was used as the underlying optimizer within the framework, and its performance was compared against both the standard training of the network with PSO (i.e., training that considers the whole set of weights) and the backward propagation of errors (backpropagation). Results show that the sequential training of sub-spaces reduces training time, achieves better generalization, and exhibits smaller variance in the architectural aspects of the network.
KW - Artificial neural networks
KW - Neuroevolution
KW - Particle swarm optimization
UR - http://www.scopus.com/inward/record.url?scp=85084752653&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-43722-0_33
DO - 10.1007/978-3-030-43722-0_33
M3 - Conference contribution
AN - SCOPUS:85084752653
SN - 9783030437213
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 513
EP - 529
BT - Applications of Evolutionary Computation - 23rd European Conference, EvoApplications 2020, Held as Part of EvoStar 2020, Proceedings
A2 - Castillo, Pedro A.
A2 - Jiménez Laredo, Juan Luis
A2 - Fernández de Vega, Francisco
PB - Springer
T2 - 23rd European Conference on Applications of Evolutionary Computation, EvoApplications 2020, held as part of EvoStar 2020
Y2 - 15 April 2020 through 17 April 2020
ER -