Globally Convergent Inexact Generalized Newton Methods with Decreasing Norm of the Gradient

Author(s)

Abstract

In this paper, motivated by the method of Martinez and Qi [1], we propose a class of globally convergent inexact generalized Newton methods for unconstrained optimization problems whose objective functions are not twice differentiable but have a Lipschitz continuous (LC) gradient. The methods force the norm of the gradient to decrease monotonically; they are implementable and globally convergent, and we prove that the algorithms achieve superlinear convergence rates under mild conditions.
The methods may also be used to solve nonsmooth equations.
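The inexact Newton idea described above can be sketched generically. The following is a minimal illustration, not the paper's algorithm: the Newton system is solved only approximately, with the residual kept below a forcing tolerance proportional to the gradient norm, and a backtracking step enforces a monotonically decreasing gradient norm. All function names, parameters, and the quadratic test problem are illustrative assumptions.

```python
import numpy as np

def cg_solve(H, b, tol):
    """Conjugate gradients for H x = b (H symmetric positive definite),
    stopped once the residual norm drops below `tol` -- the approximate
    solve is what makes the Newton step 'inexact'."""
    x = np.zeros_like(b)
    r = b.copy()
    p = r.copy()
    rs = r @ r
    while np.sqrt(rs) > tol:
        Hp = H @ p
        alpha = rs / (p @ Hp)
        x = x + alpha * p
        r = r - alpha * Hp
        rs_new = r @ r
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

def inexact_newton(grad, hess, x0, eta=0.1, tol=1e-8, max_iter=50):
    """Generic inexact Newton iteration with a monotonically decreasing
    gradient norm (illustrative sketch only)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        # Solve H d = -g only to accuracy eta * ||g|| (forcing condition).
        d = cg_solve(hess(x), -g, eta * gnorm)
        # Backtrack until the gradient norm strictly decreases.
        t = 1.0
        while np.linalg.norm(grad(x + t * d)) >= gnorm and t > 1e-12:
            t *= 0.5
        x = x + t * d
    return x

# Illustrative strongly convex quadratic test problem (assumed, not from the paper).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
x_star = np.array([1.0, 2.0])
grad = lambda x: A @ (x - x_star)
hess = lambda x: A
x = inexact_newton(grad, hess, x0=[5.0, -3.0])
```

On this quadratic the iteration drives the gradient norm down by at least the forcing factor `eta` per step, so it reaches the tolerance in a handful of iterations.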

About this article


How to Cite

Globally Convergent Inexact Generalized Newton Methods with Decreasing Norm of the Gradient. (2002). Journal of Computational Mathematics, 20(3), 289-300. https://global-sci.com/index.php/JCM/article/view/11492