TY - JOUR
T1 - Geometric Semantic Genetic Programming with Normalized and Standardized Random Programs
AU - Bakurov, Illya
AU - Muñoz Contreras, José Manuel
AU - Castelli, Mauro
AU - Rodrigues, Nuno Miguel Duarte
AU - Silva, Sara
AU - Trujillo, Leonardo
AU - Vanneschi, Leonardo
N1 - info:eu-repo/grantAgreement/FCT/6817 - DCRRNI ID/UIDB%2F04152%2F2020/PT#
info:eu-repo/grantAgreement/FCT/6817 - DCRRNI ID/UIDB%2F00408%2F2020/PT#
info:eu-repo/grantAgreement/FCT/6817 - DCRRNI ID/UIDP%2F00408%2F2020/PT#
Bakurov, I., Muñoz Contreras, J. M., Castelli, M., Rodrigues, N., Silva, S., Trujillo, L., & Vanneschi, L. (2024). Geometric Semantic Genetic Programming with Normalized and Standardized Random Programs. Genetic Programming and Evolvable Machines, 25, 1-29. Article 6. https://doi.org/10.1007/s10710-024-09479-1 --- This work was partially supported by FCT, Portugal, through funding of the research units MagIC/NOVA IMS (UIDB/04152/2020) and LASIGE (UIDB/00408/2020 and UIDP/00408/2020). This work was also supported by CONACYT (Mexico) Project CF-2023-I-724 and TecNM (Mexico) Projects 16788.23-P and 17756.23-P. José Manuel Muñoz Contreras was supported by CONACYT scholarship 771416; Nuno Rodrigues was supported by FCT PhD Grant 2021/05322/BD.
PY - 2024/6/1
Y1 - 2024/6/1
N2 - Geometric semantic genetic programming (GSGP) represents one of the most promising developments in evolutionary computation (EC) of the last decade. The results achieved by incorporating semantic awareness into the evolutionary process demonstrate the impact that geometric semantic operators have had on the field of EC. An improvement to the geometric semantic mutation (GSM) operator is proposed, inspired by the results achieved by batch normalization in deep learning. In one of its most widely used versions, GSM relies on the sigmoid function to constrain the semantics of the two random programs that perturb the parent’s semantics; here, a different approach is followed that reduces the size of the resulting programs and overcomes the issues associated with the sigmoid function, much as is done in deep learning. The idea is to consider a single random program and use it to perturb the parent’s semantics only after standardization or normalization. The experimental results demonstrate the suitability of the proposed approach: despite its simplicity, the presented GSM variants outperform standard GSGP on the studied benchmarks, with a statistically significant difference in performance. Furthermore, the individuals generated by the new GSM variants are easier to simplify, allowing the creation of accurate but significantly smaller solutions.
KW - Geometric semantic mutation
KW - Internal covariate shift
KW - Sigmoid distribution bias
KW - Model simplification
UR - https://www.webofscience.com/wos/woscc/full-record/WOS:001157627600001
UR - http://www.scopus.com/inward/record.url?scp=85185001897&partnerID=8YFLogxK
U2 - 10.1007/s10710-024-09479-1
DO - 10.1007/s10710-024-09479-1
M3 - Article
SN - 1389-2576
VL - 25
SP - 1
EP - 29
JO - Genetic Programming and Evolvable Machines
JF - Genetic Programming and Evolvable Machines
M1 - 6
ER -