Abstract

Three-term conjugate gradient (TTCG) methods have been extensively studied for optimization in Euclidean space, and some of them inherently satisfy the sufficient descent property, which strengthens their theoretical appeal. Other TTCG methods, however, restart the search direction, simplify it to the steepest descent direction, or rely on convexity assumptions on the objective function f to establish their convergence results. This paper introduces Riemannian three-term conjugate gradient (RTTCG) methods whose search direction satisfies the sufficient descent condition regardless of the line search employed and without requiring a restart mechanism. Using retraction and vector transport operators, these methods achieve global convergence under a scaled version of the nonexpansive condition and without assuming convexity of the objective function. We evaluate the effectiveness of the RTTCG methods with the Pymanopt package on three optimization problems posed on manifolds, and we analyze performance with the Dolan and Moré performance profile technique across several experimental settings. The proposed methods show promising results compared with several well-known Riemannian conjugate gradient methods. In addition, the RTTCG methods are formulated on a product manifold to solve the Gough-Stewart platform problem as a Riemannian optimization problem, and the simulations conducted confirm their effectiveness for this application.
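For context, the sketch below shows how a manifold optimization problem of the kind used in such experiments might be set up with Pymanopt. It is a minimal illustration only, assuming Pymanopt 2.x: it uses the package's built-in ConjugateGradient optimizer on a Rayleigh-quotient problem over the unit sphere, not the RTTCG methods proposed in the paper, and the problem, matrix, and parameters are illustrative choices rather than the paper's test set.

import autograd.numpy as np
import pymanopt
from pymanopt.manifolds import Sphere
from pymanopt.optimizers import ConjugateGradient

# Illustrative problem: maximize the Rayleigh quotient of a random
# symmetric matrix over the unit sphere (dominant eigenvector).
n = 100
np.random.seed(0)
A = np.random.randn(n, n)
A = 0.5 * (A + A.T)

manifold = Sphere(n)

@pymanopt.function.autograd(manifold)
def cost(x):
    # Negate so that minimizing corresponds to the largest eigenvalue.
    return -x @ A @ x

problem = pymanopt.Problem(manifold, cost)

# Pymanopt's built-in Riemannian conjugate gradient optimizer stands in
# here for the paper's RTTCG methods, which are not part of the package.
optimizer = ConjugateGradient(verbosity=0)
result = optimizer.run(problem)
print("Estimated largest eigenvalue:", float(result.point @ A @ result.point))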
