Abstract
Three-term conjugate gradient (TTCG) methods have been extensively studied for optimization in Euclidean geometry; some of them inherently satisfy the sufficient descent property, which strengthens their theoretical appeal. Other TTCG methods, however, either restart the search direction, reduce to the steepest descent direction, or rely on convexity assumptions on the objective function to establish convergence. This paper introduces Riemannian three-term conjugate gradient (RTTCG) methods. The search direction of these RTTCG methods always satisfies the sufficient descent condition, regardless of the line search employed and without any restart mechanism. By using retraction and vector transport operators, the methods achieve global convergence under a scaled version of the nonexpansive condition and without assuming convexity of the objective function. Using the Pymanopt package, we evaluate the effectiveness of the RTTCG methods on three optimization problems on manifolds. Performance is analyzed with the Dolan and Moré performance-profile technique across different experimental instances, and the proposed methods show promising results compared with several well-known Riemannian conjugate gradient methods. Additionally, the RTTCG methods are formulated on a product manifold to solve the Gough-Stewart platform problem as a Riemannian optimization problem. The simulations conducted confirm their effectiveness for this application.
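To make the three-term scheme described in the abstract concrete, the following is a minimal NumPy sketch of a Riemannian three-term conjugate gradient iteration, not the paper's specific RTTCG variants. The manifold (the unit sphere), the cost (a Rayleigh quotient), the normalization retraction, the projection-based vector transport, and the HS-type three-term coefficients are all illustrative assumptions drawn from the classical Euclidean TTCG literature; the helper name `ttcg_sphere` is hypothetical. With this coefficient choice, the new direction satisfies the sufficient descent identity d^T g = -||g||^2 by construction, without a restart, which mirrors the property the abstract emphasizes.

```python
import numpy as np

def proj(x, v):
    """Project v onto the tangent space of the unit sphere at x."""
    return v - (x @ v) * x

def retract(x, v):
    """Retraction on the sphere: step along v, then renormalize."""
    y = x + v
    return y / np.linalg.norm(y)

def ttcg_sphere(A, x0, max_iter=500, tol=1e-8):
    """Minimize f(x) = x^T A x on the unit sphere with an illustrative
    three-term CG direction (HS-type beta and theta; not the paper's)."""
    x = x0 / np.linalg.norm(x0)
    g = proj(x, 2 * A @ x)          # Riemannian gradient of f
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking (Armijo) line search along the descent direction d.
        f, t, slope = x @ A @ x, 1.0, g @ d
        for _ in range(60):
            x_new = retract(x, t * d)
            if x_new @ A @ x_new <= f + 1e-4 * t * slope:
                break
            t *= 0.5
        g_new = proj(x_new, 2 * A @ x_new)
        # Vector transport by projection onto the new tangent space.
        d_t = proj(x_new, d)
        y = g_new - proj(x_new, g)
        denom = d_t @ y
        if abs(denom) > 1e-12:
            beta = (g_new @ y) / denom
            theta = (g_new @ d_t) / denom
            # Three-term direction: the beta and theta terms cancel in
            # d^T g, so sufficient descent holds automatically.
            d = -g_new + beta * d_t - theta * y
        else:
            d = -g_new                # degenerate denominator fallback
        x, g = x_new, g_new
    return x, x @ A @ x
```

For example, with `A = np.diag([1.0, 2.0, 3.0])` the minimum of the Rayleigh quotient on the sphere is the smallest eigenvalue, 1.0, attained at the first coordinate axis, and the sketch recovers it from a generic starting point.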
Recommended Citation
Salihu, Nasiru; Kumam, Poom; Wang, Lin; and Salisu, Sani (2025) "Some new three-term conjugate gradient methods for Riemannian optimization with application to the Gough-Stewart platform," Mathematical Modelling and Numerical Simulation with Applications: Vol. 5: Iss. 3, Article 2.
DOI: https://doi.org/10.53391/2791-8564.1001
Available at: https://mmnsa.researchcommons.org/journal/vol5/iss3/2