TY - JOUR
T1 - SLiSeS: subsampled line search spectral gradient method for finite sums
AU - Bellavia, Stefania
AU - Krejić, Nataša
AU - Krklec Jerinkić, Nataša
AU - Raydan, Marcos
N1 - info:eu-repo/grantAgreement/FCT/6817 - DCRRNI ID/UIDB%2F00297%2F2020/PT#
info:eu-repo/grantAgreement/FCT/6817 - DCRRNI ID/UIDP%2F00297%2F2020/PT#
Funding information:
The first author acknowledges financial support received by the INdAM GNCS and by PNRR – Missione 4 Istruzione e Ricerca – Componente C2 Investimento 1.1, Fondo per il Programma Nazionale di Ricerca e Progetti di Rilevante Interesse Nazionale (PRIN) funded by the European Commission under the NextGeneration EU programme, project ‘Advanced optimization METhods for automated central veIn Sign detection in multiple sclerosis from magneTic resonAnce imaging (AMETISTA)’, code: P2022J9SNP, MUR D.D. financing decree n. 1379 of 1st September 2023 (CUP E53D23017980001), and project ‘Numerical Optimization with Adaptive Accuracy and Applications to Machine Learning’, code: 2022N3ZNAX, MUR D.D. financing decree n. 973 of 30th June 2023 (CUP B53D23012670006). The second and third authors were financially supported by the Science Fund of the Republic of Serbia, Grant no. 7359, Project LASCADO. The fourth author was financially supported by Fundação para a Ciência e a Tecnologia (Portuguese Foundation for Science and Technology) under the scope of the projects UIDB/MAT/00297/2020 (doi.org/10.54499/UIDB/00297/2020) and UIDP/MAT/00297/2020 (doi.org/10.54499/UIDP/00297/2020) (Centro de Matemática e Aplicações).
Publisher Copyright:
© 2024 The Author(s). Published by Informa UK Limited, trading as Taylor & Francis Group.
PY - 2024/12/9
Y1 - 2024/12/9
AB - The spectral gradient method is known to be a powerful low-cost tool for solving large-scale optimization problems. In this paper, our goal is to exploit its advantages in the stochastic optimization framework, especially in the case of mini-batch subsampling that is often used in big data settings. To allow the spectral coefficient to properly explore the underlying approximate Hessian spectrum, we keep the same subsample for a prefixed number of iterations before subsampling again. We analyse the required algorithmic features and the conditions for almost sure convergence, and present initial numerical results that show the advantages of the proposed method.
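N1 - The mechanism the abstract describes can be sketched in a few lines of Python: a safeguarded Barzilai-Borwein (spectral) step computed on a mini-batch that is held fixed for a prefixed number of inner iterations before resampling. This is a minimal illustration under assumed names and defaults (slises_sketch, subgrad, inner_iters, the BB1 coefficient), not the authors' SLiSeS algorithm as specified in the paper, which also includes a nonmonotone line search and the safeguards used in the convergence analysis.

import numpy as np

def slises_sketch(subgrad, n, x0, batch_size=32, inner_iters=5,
                  outer_iters=100, alpha_min=1e-10, alpha_max=1e10,
                  seed=0):
    # subgrad(x, idx) -> gradient of (1/|idx|) * sum_{i in idx} f_i(x),
    # i.e. the averaged gradient of the subsampled finite-sum objective.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(outer_iters):
        # Draw a fresh mini-batch, then hold it fixed for inner_iters steps
        # so the spectral coefficient can explore the (approximate) Hessian
        # spectrum of one subsampled function.
        idx = rng.choice(n, size=batch_size, replace=False)
        x_old = g_old = None
        alpha = 1.0
        for _ in range(inner_iters):
            g = subgrad(x, idx)
            if g_old is not None:
                s, y = x - x_old, g - g_old
                sy = float(s @ y)
                if sy > 0.0:
                    # Safeguarded BB1 spectral steplength s's / s'y
                    alpha = min(max(float(s @ s) / sy, alpha_min), alpha_max)
            x_old, g_old = x, g
            x = x - alpha * g  # a line search would safeguard this step
    return x

For example, subgrad could average per-sample logistic-loss gradients over the index set idx; the per-iteration cost then scales with the batch size rather than with n, which is the point of mini-batch subsampling in big data settings.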
KW - Finite sum minimization
KW - line search
KW - spectral gradient methods
KW - subsampling
UR - http://www.scopus.com/inward/record.url?scp=85211448291&partnerID=8YFLogxK
DO - 10.1080/10556788.2024.2426620
M3 - Article
AN - SCOPUS:85211448291
SN - 1055-6788
SP - 1
EP - 26
JO - Optimization Methods and Software
JF - Optimization Methods and Software
ER -