A Greedy Iterative Layered Framework for Training Feed Forward Neural Networks

L. L. Custode, C. L. Tecce, I. Bakurov, Mauro Castelli, A. Della Cioppa, Leonardo Vanneschi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In recent years, neuroevolution has become a dynamic and rapidly growing research field. Interest in this discipline is motivated by the need to create ad-hoc networks whose topology and parameters are optimized according to the particular problem at hand. Although neuroevolution-based techniques can contribute fundamentally to improving the performance of artificial neural networks (ANNs), they present a drawback related to the massive amount of computational resources they require. This paper proposes a novel population-based framework aimed at finding the optimal set of synaptic weights for ANNs. The proposed method partitions the weights of a given network and, using an optimization heuristic, trains one layer at each step while “freezing” the remaining weights. In the experimental study, particle swarm optimization (PSO) was used as the underlying optimizer within the framework, and its performance was compared against the standard training of the network with PSO (i.e., training that considers the whole set of weights) and the backward propagation of the errors (backpropagation). Results show that the subsequent training of sub-spaces reduces training time, achieves better generalizability, and leads to smaller variance in the architectural aspects of the network.
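The abstract describes training one layer at a time with PSO while the remaining layers are frozen. The sketch below illustrates that idea in Python; the network sizes, PSO hyperparameters, and function names are illustrative assumptions, not values or code from the paper.

```python
# Minimal sketch of the layer-wise ("greedy iterative layered") training idea: each pass
# optimizes one layer's weights with a basic PSO while the other layers stay frozen.
# All names, sizes, and hyperparameters are illustrative assumptions, not from the paper.
import numpy as np

rng = np.random.default_rng(0)

def forward(weights, X):
    """Forward pass of a simple feed-forward net (tanh hidden layers, linear output)."""
    a = X
    for W in weights[:-1]:
        a = np.tanh(a @ W)
    return a @ weights[-1]

def mse(weights, X, y):
    return np.mean((forward(weights, X) - y) ** 2)

def pso_optimize_layer(weights, layer_idx, X, y, n_particles=20, n_iters=50,
                       w=0.7, c1=1.5, c2=1.5):
    """Optimize only weights[layer_idx] with PSO; all other layers are frozen."""
    shape = weights[layer_idx].shape
    dim = int(np.prod(shape))

    def fitness(flat):
        trial = list(weights)
        trial[layer_idx] = flat.reshape(shape)
        return mse(trial, X, y)

    pos = rng.normal(scale=0.5, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_fit = np.array([fitness(p) for p in pos])
    gbest = pbest[np.argmin(pbest_fit)].copy()
    gbest_fit = pbest_fit.min()

    for _ in range(n_iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        fit = np.array([fitness(p) for p in pos])
        improved = fit < pbest_fit
        pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
        if fit.min() < gbest_fit:
            gbest, gbest_fit = pos[np.argmin(fit)].copy(), fit.min()

    weights[layer_idx] = gbest.reshape(shape)
    return gbest_fit

# Toy usage: a 2-4-1 network trained layer by layer on a small regression task.
X = rng.normal(size=(64, 2))
y = X[:, :1] * X[:, 1:2]                    # target: product of the two inputs
weights = [rng.normal(scale=0.5, size=(2, 4)), rng.normal(scale=0.5, size=(4, 1))]

for sweep in range(3):                      # a few greedy sweeps over the layers
    for layer_idx in range(len(weights)):   # train one layer, freeze the rest
        err = pso_optimize_layer(weights, layer_idx, X, y)
    print(f"sweep {sweep}: MSE = {err:.4f}")
```

Because each PSO run searches only one layer's weight sub-space, the dimensionality of each optimization step stays small, which is the motivation the abstract gives for reduced training time.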

Original language: English
Title of host publication: Applications of Evolutionary Computation - 23rd European Conference, EvoApplications 2020, Held as Part of EvoStar 2020, Proceedings
Editors: Pedro A. Castillo, Juan Luis Jiménez Laredo, Francisco Fernández de Vega
Publisher: Springer
Pages: 513-529
Number of pages: 17
ISBN (Print): 9783030437213
DOIs
Publication status: Published - 9 Apr 2020
Event: 23rd European Conference on Applications of Evolutionary Computation, EvoApplications 2020, held as part of EvoStar 2020 - Seville, Spain
Duration: 15 Apr 2020 - 17 Apr 2020

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 12104 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 23rd European Conference on Applications of Evolutionary Computation, EvoApplications 2020, held as part of EvoStar 2020
Country: Spain
City: Seville
Period: 15/04/20 - 17/04/20

Keywords

  • Artificial neural networks
  • Neuroevolution
  • Particle swarm optimization
