Geometrical inverse matrix approximation for least-squares problems and acceleration strategies

Jean Paul Chehab, Marcos Raydan

Research output: Contribution to journal › Article › peer-review

Abstract

We extend the geometrical inverse approximation approach to the linear least-squares scenario. For that, we focus on the minimization of 1 − cos(X(AᵀA), I), where A is a full-rank matrix of size m × n, with m ≥ n, and X is an approximation of the inverse of AᵀA. In particular, we adapt the recently published simplified gradient-type iterative scheme MinCos to the least-squares problem. In addition, we combine the generated convergent sequence of matrices with well-known acceleration strategies based on recently developed matrix extrapolation methods, and also with some line-search acceleration schemes based on selecting an appropriate steplength at each iteration. A set of numerical experiments, including large-scale problems, is presented to illustrate the performance of the different acceleration strategies.
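For reference, the cosine between two matrices in this geometrical setting is the one induced by the Frobenius inner product, cos(B, C) = ⟨B, C⟩_F / (‖B‖_F ‖C‖_F). The following is a minimal Python sketch, under that assumption, of how the objective 1 − cos(X(AᵀA), I) from the abstract can be evaluated for a candidate approximate inverse X; the function names, the scaled-identity initial guess, and the random test data are illustrative assumptions and are not the authors' MinCos implementation or accelerations.

```python
import numpy as np

def frobenius_cos(B, C):
    """Cosine between matrices induced by the Frobenius inner product:
    cos(B, C) = <B, C>_F / (||B||_F ||C||_F)."""
    inner = np.trace(B.T @ C)
    return inner / (np.linalg.norm(B, 'fro') * np.linalg.norm(C, 'fro'))

def lsq_objective(X, A):
    """Objective 1 - cos(X(A^T A), I) from the abstract, where X is meant to
    approximate (A^T A)^{-1} for a full-rank A of size m x n with m >= n."""
    n = A.shape[1]
    B = A.T @ A                      # normal-equations matrix
    return 1.0 - frobenius_cos(X @ B, np.eye(n))

# Illustrative usage (random data, not taken from the paper):
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 30))            # m >= n, full rank with high probability
X0 = np.eye(30) / np.trace(A.T @ A)           # simple scaled-identity starting guess
print(lsq_objective(X0, A))                   # positive value; 0 means X(A^T A) is aligned with I
print(lsq_objective(np.linalg.inv(A.T @ A), A))  # exact inverse gives (numerically) 0
```

The scaled-identity guess above is only a placeholder; the MinCos iterates, the matrix extrapolation methods, and the steplength-based line-search accelerations that drive this objective toward zero are described in the article itself.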

Original language: English
Pages (from-to): 1213-1231
Journal: Numerical Algorithms
Volume: 85
DOIs
Publication status: Published - 2020

Keywords

  • Gradient-type methods
  • Inverse approximation
  • Matrix acceleration techniques
